Spark 2.0.2 ClassNotFoundException: org.apache.kafka.clients.consumer.Consumer

xjreopfe · published 2021-06-07 in Kafka

Below is my pom.xml. I build the jar with maven-shade, and I am quite sure org.apache.kafka.clients.consumer.Consumer is included in my uber jar. I also put kafka-clients-0.10.1.0.jar into spark.driver.extraLibraryPath, and I tried adding the --jars option to the spark-submit command. But I still get a ClassNotFoundException.

<dependencies>
            <dependency>
                <groupId>org.scala-lang</groupId>
                <artifactId>scala-reflect</artifactId>
                <version>2.11.8</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
                <version>2.0.2</version>
            </dependency>
            <dependency>
                <groupId>org.apache.kafka</groupId>
                <artifactId>kafka_2.11</artifactId>
                <version>0.10.1.0</version>
            </dependency>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>3.8.1</version>
                <scope>test</scope>
            </dependency>
        </dependencies>
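Since the question mentions packaging with maven-shade, here is a minimal sketch of a shade plugin configuration that produces an uber jar (the plugin version is an assumption; adjust it to your build):

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.3</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <!-- merge META-INF/services entries so service loaders
                                 (used by Kafka and Spark) still resolve after shading -->
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

Without the ServicesResourceTransformer, shading can silently drop or overwrite service-loader files even when the class files themselves are present in the jar.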

o2gm4chl1#

I just found a workaround: put the jar into SPARK_HOME/jars. I launch with the spark-submit command, and I tried adding --jars and --driver-library-path; I am sure those options take effect, but I still got ClassNotFoundException. I found this bypass based on the driver logs shown below.
(screenshot of the driver log omitted)
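The workaround described above can be sketched as follows (jar names, the main class, and the master URL are placeholders, not taken from the original post):

```shell
# Workaround: copy the missing Kafka client jar into Spark's own classpath,
# which every driver and executor JVM picks up automatically.
cp kafka-clients-0.10.1.0.jar "$SPARK_HOME/jars/"

# Then submit as usual; per the answer, --jars alone did not resolve the error.
spark-submit \
  --class com.example.StreamingJob \
  --master "local[2]" \
  my-uber.jar
```

Jars placed in SPARK_HOME/jars affect every application run from that installation, so this is a blunt fix compared with getting --jars or the uber jar to work.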

gstyhher2#

Basically, you need:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.1.0</version>
</dependency>
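To check which Kafka artifacts Maven actually resolves onto the classpath (and whether kafka-clients is among them), one way is to inspect the dependency tree — a sketch, assuming Maven is installed:

```shell
# List every org.apache.kafka artifact in the resolved dependency tree;
# kafka-clients should appear here for the consumer classes to be packaged.
mvn dependency:tree -Dincludes=org.apache.kafka
```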
