Spark 2.4.6: A JNI error has occurred

czq61nw1 posted on 2021-05-27 in Spark

On my Windows machine, I am trying to run Spark 2.4.6 without Hadoop, using spark-2.4.6-bin-without-hadoop-scala-2.12.tgz.
After setting SPARK_HOME, HADOOP_HOME, and SPARK_DIST_CLASSPATH per the information in the article linked here,
when I try to launch spark-shell, I get this error -

Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
        at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
        at java.lang.Class.getMethod0(Class.java:3018)
        at java.lang.Class.getMethod(Class.java:1784)
        at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 7 more
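
For what it's worth, org.slf4j.Logger normally comes from the slf4j-api jar, which a without-hadoop build is supposed to pick up from the Hadoop jars via SPARK_DIST_CLASSPATH. A minimal sanity check from cmd (a sketch; paths assume my layout below, and hadoop-2.7.x ships slf4j under share\hadoop\common\lib):

REM Confirm the slf4j jars are actually present in the Hadoop install
dir C:\opt\spark\hadoop-2.7.3\share\hadoop\common\lib\slf4j*.jar
REM Confirm the hadoop launcher itself runs and prints a classpath
C:\opt\spark\hadoop-2.7.3\bin\hadoop classpath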

The link referenced above seems to point at SPARK_DIST_CLASSPATH, but I already have this in my system variables -
$HADOOP_HOME;$HADOOP_HOME\etc\hadoop\*;$HADOOP_HOME\share\hadoop\common\lib\*;$HADOOP_HOME\share\hadoop\common\*;$HADOOP_HOME\share\hadoop\hdfs\*;$HADOOP_HOME\share\hadoop\hdfs\lib\*;$HADOOP_HOME\share\hadoop\hdfs\*;$HADOOP_HOME\share\hadoop\yarn\lib\*;$HADOOP_HOME\share\hadoop\yarn\*;$HADOOP_HOME\share\hadoop\mapreduce\lib\*;$HADOOP_HOME\share\hadoop\mapreduce\*;$HADOOP_HOME\share\hadoop\tools\lib\*;
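
One caveat when storing that value as a Windows system variable: cmd expands %VAR% references, not $VAR, so entries written literally as $HADOOP_HOME\... never resolve. The same listing in cmd syntax would look roughly like this (a sketch, same entries as above with the duplicate dropped):

REM Windows expands %HADOOP_HOME%, not $HADOOP_HOME; trailing \* keeps the jar-wildcard semantics
set SPARK_DIST_CLASSPATH=%HADOOP_HOME%;%HADOOP_HOME%\etc\hadoop\*;%HADOOP_HOME%\share\hadoop\common\lib\*;%HADOOP_HOME%\share\hadoop\common\*;%HADOOP_HOME%\share\hadoop\hdfs\*;%HADOOP_HOME%\share\hadoop\hdfs\lib\*;%HADOOP_HOME%\share\hadoop\yarn\lib\*;%HADOOP_HOME%\share\hadoop\yarn\*;%HADOOP_HOME%\share\hadoop\mapreduce\lib\*;%HADOOP_HOME%\share\hadoop\mapreduce\*;%HADOOP_HOME%\share\hadoop\tools\lib\*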
I also have this line in Spark's spark-env.sh -

export SPARK_DIST_CLASSPATH=$(C:\opt\spark\hadoop-2.7.3\bin\hadoop classpath)
HADOOP_HOME = C:\opt\spark\hadoop-2.7.3
SPARK_HOME = C:\opt\spark\spark-2.4.6-bin-without-hadoop-scala-2.12
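
As an aside, spark-env.sh is only sourced by the Unix launch scripts; the Windows .cmd launchers look for conf\spark-env.cmd instead. A rough cmd equivalent of that export line (a sketch, untested, using the hadoop path above) would be:

REM conf\spark-env.cmd - read by the Windows launch scripts, which do not source spark-env.sh
REM for /f captures the stdout of `hadoop classpath` into the variable
for /f "delims=" %%i in ('C:\opt\spark\hadoop-2.7.3\bin\hadoop classpath') do set SPARK_DIST_CLASSPATH=%%i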

When I try Spark 2.4.5 bundled with Hadoop, it seems to run fine. That tells me something is wrong with how my Hadoop is set up. What am I missing? Thanks!
