After installing jdk9, I see this problem:

```
$hive
Java HotSpot(TM) 64-Bit Server VM warning: Ignoring option MaxPermSize; support was removed in 8.0
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hive/2.3.1/libexec/lib/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/local/Cellar/hadoop/2.8.1/libexec/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.ClassCastException: java.base/jdk.internal.loader.ClassLoaders$AppClassLoader cannot be cast to java.base/java.net.URLClassLoader
at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:394)
at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:370)
at org.apache.hadoop.hive.cli.CliSessionState.<init>(CliSessionState.java:60)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:708)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:686)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
```
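The cast fails because, starting with JDK 9, the application class loader is no longer a java.net.URLClassLoader, which Hive 2.3.1's SessionState assumes (first frame of the trace), so the CLI has to run on JDK 8. A quick first check is to see which JDKs macOS knows about and which one it resolves by default; a minimal sketch using the stock java_home helper:

```
# List every installed JVM; without arguments, java_home prints the default
# (highest-version) JDK, which is what can still pull in JDK 9 even after
# PATH points at java 8.
/usr/libexec/java_home -V
/usr/libexec/java_home
```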
But I have updated $PATH to point to java8:

```
$java -version
java version "1.8.0_144"
Java(TM) SE Runtime Environment (build 1.8.0_144-b01)
Java HotSpot(TM) 64-Bit Server VM (build 25.144-b01, mixed mode)
```
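For reference, pinning both PATH and JAVA_HOME in the shell profile is usually what this step amounts to, since several launchers consult JAVA_HOME rather than the first java on PATH. A minimal sketch, assuming the macOS java_home helper:

```
# ~/.bash_profile (or ~/.zshrc) - pin the shell to JDK 8
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
export PATH="$JAVA_HOME/bin:$PATH"
```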
Updated hive to point at the java8 executable, via vi $(which hive):

```
#!/bin/bash
JAVA_HOME="$(/usr/libexec/java_home --version 1.8)" HIVE_HOME="/usr/local/Cellar/hive/2.3.1/libexec" exec "/usr/local/Cellar/hive/2.3.1/libexec/bin/hive" "$@"
```
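It is worth double-checking that the lookup inside that wrapper really resolves to a JDK 8 java. A small sketch of such a check, using the same java_home call as the wrapper:

```
# Run the exact lookup the wrapper performs and see which java it yields
JAVA_HOME="$(/usr/libexec/java_home --version 1.8)"
"$JAVA_HOME/bin/java" -version   # should report 1.8.0_xx, not 9
```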
Verified that the updated java version points to jdk8:

```
$/usr/libexec/java_home --version 1.8
/Library/Java/JavaVirtualMachines/jdk1.8.0_144.jdk/Contents/Home
```
What else should I investigate?

This is Hive 2.3.1 on macOS:
```
$hive --version
Hive 2.3.1
Git git://jcamachorodriguez-rMBP.local/Users/jcamachorodriguez/src/workspaces/hive/HIVE-apache/hive -r 7590572d9265e15286628013268b2ce785c6aa08
Compiled by jcamachorodriguez on Thu Oct 19 18:37:58 PDT 2017
From source with checksum 03c91029a6103bd91f25a6ff8a01fbcd
```
3 Answers
dnph8jn41#
I had the same problem. I simply removed jdk9 instead of changing the environment, and that solved it.
Looking at Hive, it uses a cast to java.net.URLClassLoader (in SessionState, as the trace above shows), and the JDK 9 application class loader is no longer one.
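If removing JDK 9 is the chosen route, on macOS that roughly means deleting its directory under /Library/Java/JavaVirtualMachines. A sketch; the jdk-9.jdk directory name is an assumption, so list the directory first and use the name actually present:

```
# List installed JDKs, then remove the JDK 9 entry (directory name may differ)
ls /Library/Java/JavaVirtualMachines
sudo rm -rf /Library/Java/JavaVirtualMachines/jdk-9.jdk
```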
uttx8gqw2#
I hit the same error while setting up Hive, and initially I thought it might be due to a different java version.
But when I checked, the java version was JDK 1.8.
Finally, while checking the JDK installation directory (/Library/Java/JavaVirtualMachines), I found that both jdk1.8 and jdk10 were present.
I deleted JDK 10 and it finally worked.
yshpjwxd3#
Installing jdk8 and changing the path accordingly in hadoop-env.sh did the trick for me.
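Since the Hive CLI is launched through Hadoop's scripts (note RunJar in the trace above), a JAVA_HOME set in hadoop-env.sh overrides whatever the shell or the hive wrapper exports. A minimal sketch of that change, assuming the Homebrew hadoop 2.8.1 layout shown in the logs (the exact hadoop-env.sh location may differ):

```
# In hadoop-env.sh, e.g. under the Homebrew layout:
#   /usr/local/Cellar/hadoop/2.8.1/libexec/etc/hadoop/hadoop-env.sh
# replace or add the JAVA_HOME export so Hadoop (and Hive via RunJar) uses JDK 8:
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"
```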