Spark Kafka streaming problem

jogvjijk · posted 2021-06-07 in Kafka

I am using Maven, and I added the following dependencies:

<dependency> <!-- Spark dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency> <!-- Spark dependency -->
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka_2.10</artifactId>
  <version>1.1.0</version>
</dependency>

I also added the jar in my code:

SparkConf sparkConf = new SparkConf().setAppName("KafkaSparkTest");
JavaSparkContext sc = new JavaSparkContext(sparkConf);
sc.addJar("/home/test/.m2/repository/org/apache/spark/spark-streaming-kafka_2.10/1.0.2/spark-streaming-kafka_2.10-1.0.2.jar");
JavaStreamingContext jssc = new JavaStreamingContext(sc, new Duration(5000));

It compiles without any errors, but I get the following error when running spark-submit. Any help is much appreciated. Thanks for your time.

bin/spark-submit --class "KafkaSparkStreaming" --master local[4] try/simple-project/target/simple-project-1.0.jar

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils
    at KafkaSparkStreaming.sparkStreamingTest(KafkaSparkStreaming.java:40)
    at KafkaSparkStreaming.main(KafkaSparkStreaming.java:23)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:303)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:55)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)


chy5wohz1#

I ran into the same problem and solved it by building a jar with dependencies.
Remove the `sc.addJar()` call from your code.
Add the following to your pom.xml:

<build>
    <sourceDirectory>src/main/java</sourceDirectory>
    <testSourceDirectory>src/test/java</testSourceDirectory>
    <plugins>
      <!--
        Bind the maven-assembly-plugin to the package phase.
        This builds a single jar that bundles all dependencies,
        suitable for submitting to a cluster.
       -->
      <plugin>
        <artifactId>maven-assembly-plugin</artifactId>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
          <archive>
            <manifest>
              <mainClass></mainClass>
            </manifest>
          </archive>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
</build>

Package the project, then submit the resulting "example-jar-with-dependencies.jar".
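The build-and-submit steps above might look like the following. This is a sketch: the class name and jar name are taken from the question's spark-submit command, and the exact output jar name depends on your project's artifactId and version.

```shell
# Build the fat jar; with the assembly plugin bound to the package phase,
# Maven writes an extra artifact with the "jar-with-dependencies" suffix
# into target/.
mvn clean package

# Submit the assembled jar instead of the plain one, so that
# spark-streaming-kafka and its transitive dependencies are on the classpath.
bin/spark-submit --class "KafkaSparkStreaming" --master local[4] \
  target/simple-project-1.0-jar-with-dependencies.jar
```

Submitting the plain jar (without the `-jar-with-dependencies` suffix) reproduces the original `NoClassDefFoundError`, because the Kafka streaming classes are not bundled in it.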


2skhul332#

For future reference: if you get a ClassNotFoundException, searching for the missing class (e.g. "org.apache.spark...") will usually lead you to a Maven repository page that tells you which dependency is missing from your pom file. It will also give you the exact snippet to paste into your pom.
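For example, the missing `org.apache.spark.streaming.kafka.KafkaUtils` class from the stack trace above lives in the `spark-streaming-kafka` artifact; the snippet the Maven page gives you (for the Scala 2.10 / 1.1.0 versions used in the question) looks like:

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming-kafka_2.10</artifactId>
  <version>1.1.0</version>
</dependency>
```

Declaring the dependency alone is not enough for a cluster submit, though, which is why the accepted answer bundles it into a jar-with-dependencies.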
