Exception in thread "main" java.lang.NoClassDefFoundError over SSH: org/apache/hadoop/fs/FSDataInputStream

dgtucam1 · posted 2021-05-27 in Hadoop

I am new to working with servers; on my own computer I have no problems using Apache Spark, and I normally run the code from IntelliJ.
I tried to run the project on an external server over SSH, but I get this error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$assertOnDriver(SparkSession.scala:1086)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:902)
    at com.p53.main(p53.java:42)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMainV2.main(AppMainV2.java:131)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 8 more
When I run Spark from the terminal on the server (/usr/local/spark/bin/spark-shell), it works fine.
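For reference, a quick way to see which jars the server's Hadoop installation provides (a sketch; it assumes a Hadoop install with the hadoop command on the PATH):

# Print the classpath entries of the local Hadoop installation.
# org.apache.hadoop.fs.FSDataInputStream lives in hadoop-common, which is
# covered by the .../share/hadoop/common/* entries listed here.
hadoop classpath

# The variable Spark's launch scripts use to pick those jars up; if it is
# empty and Spark was built without bundled Hadoop, FSDataInputStream
# cannot be found.
echo "$SPARK_DIST_CLASSPATH"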
My pom dependencies are:

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.4.3</version>
            <scope>runtime</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-hdfs</artifactId>
            <version>3.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-common</artifactId>
            <version>3.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.2.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-catalyst_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-mllib_2.11</artifactId>
            <version>2.4.3</version>
        </dependency>
    </dependencies>

The pom plugins:

        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.5.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <artifactId>maven-jar-plugin</artifactId>
                <version>3.0.2</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                    <archive>
                        <manifest>
                            <mainClass>Main</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
        </plugins>

I know I am doing something wrong or missing something, but I just can't figure out where the problem is.


krugob8w · 1#

You need to set SPARK_DIST_CLASSPATH, so that the Hadoop jars (which provide org.apache.hadoop.fs.FSDataInputStream) end up on Spark's classpath:

export SPARK_DIST_CLASSPATH=`hadoop classpath`
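To make the setting stick across sessions (a sketch, not part of the answer above; it assumes Spark is installed under /usr/local/spark as in the question and that the hadoop command is on the PATH), the same export can go into Spark's environment file:

# Appending the export to spark-env.sh makes spark-shell and spark-submit
# pick up the Hadoop jars automatically on every launch.
echo 'export SPARK_DIST_CLASSPATH=$(hadoop classpath)' >> /usr/local/spark/conf/spark-env.sh

Note that spark-env.sh is only read by Spark's own launch scripts, so this helps when the job is started with spark-submit or spark-shell; a plain java launch would need the Hadoop classpath appended to -cp instead.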
