Flink-connector-kafka giving Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.kafka.clients.admin.AdminClientConfig [duplicate]

sgtfey8w · asked 12 months ago in Apache

This question already has an answer here:

Why am I getting a NoClassDefFoundError in Java? (32 answers)
Closed 19 days ago
My project works with flink-connector-kafka and uses the following dependency in my pom:

<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-connector-kafka</artifactId>
   <version>3.0.1-1.18</version>
</dependency>

Whenever I submit my job, I get *Could not initialize class org.apache.kafka.clients.admin.AdminClientConfig*.
The stack trace is below:

org.apache.flink.util.FlinkException: Global failure triggered by OperatorCoordinator for 'Source: Kafka - Source -> Flat Map -> Sink: Writer -> Sink: Committer' (operator 61252f73469d3ffba207c548d29a0267).
    at org.apache.flink.runtime.operators.coordination.OperatorCoordinatorHolder$LazyInitializedCoordinatorContext.failJob(OperatorCoordinatorHolder.java:624)
    at org.apache.flink.runtime.operators.coordination.RecreateOnResetOperatorCoordinator$QuiesceableContext.failJob(RecreateOnResetOperatorCoordinator.java:248)
    at org.apache.flink.runtime.source.coordinator.SourceCoordinatorContext.failJob(SourceCoordinatorContext.java:395)
    at org.apache.flink.runtime.source.coordinator.SourceCoordinator.lambda$runInEventLoop$10(SourceCoordinator.java:483)
    at org.apache.flink.util.ThrowableCatchingRunnable.run(ThrowableCatchingRunnable.java:40)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.kafka.clients.admin.AdminClientConfig
    at org.apache.kafka.clients.admin.Admin.create(Admin.java:133)
    at org.apache.kafka.clients.admin.AdminClient.create(AdminClient.java:39)
    at org.apache.flink.connector.kafka.source.enumerator.KafkaSourceEnumerator.getKafkaAdminClient(KafkaSourceEnumerator.java:410)
    at org.apache.flink.connector.kafka.source.enumerator.KafkaSourceEnumerator.start(KafkaSourceEnumerator.java:151)
    at org.apache.flink.runtime.source.coordinator.SourceCoordinator.lambda$start$1(SourceCoordinator.java:233)
    at org.apache.flink.runtime.source.coordinator.SourceCoordinator.lambda$runInEventLoop$10(SourceCoordinator.java:469)
    ... 8 more


I have checked that there are no conflicting dependencies and that all required dependencies are packaged in my jar. Can anyone suggest what I am doing wrong here?
Thanks in advance!

1dkrff031#

For NoClassDefFoundError/ClassNotFoundException errors like this, it is almost certainly a packaging problem: when the job was packaged before deployment (e.g., compiled into a jar / shaded jar), the Flink application was built without one or more of the expected dependencies. Because they are not packaged into the application, you hit these errors at runtime as soon as your code references the missing classes.
As you mentioned, you are already referencing the appropriate Kafka connector dependency (flink-connector-kafka), but you may want to make sure the Kafka client library itself (kafka-clients) is also added and available as a dependency, and that both use compile scope:

<!-- Connector Dependency -->
<dependency>
   <groupId>org.apache.flink</groupId>
   <artifactId>flink-connector-kafka</artifactId>
   <version>3.0.1-1.18</version>
   <scope>compile</scope>
</dependency>
<!-- Client Dependency -->
<dependency>
   <groupId>org.apache.kafka</groupId>
   <artifactId>kafka-clients</artifactId>
   <version>${kafka.version}</version>
   <scope>compile</scope>
</dependency>

I know that compile scope is the default and should not strictly be necessary, but I have seen cases where it seemed to be required when building Flink-based jars, so I would suggest defining it explicitly before trying to remove it.
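Since the NoClassDefFoundError indicates the class is missing from the runtime classpath, it is also worth confirming that your build actually bundles kafka-clients into the job jar you submit. Below is a minimal maven-shade-plugin sketch for building such a fat jar; the plugin version, the mainClass, and the signature-file excludes are illustrative placeholders, not taken from the question:

<!-- Build a shaded (fat) jar that bundles the Kafka connector and kafka-clients -->
<build>
   <plugins>
      <plugin>
         <groupId>org.apache.maven.plugins</groupId>
         <artifactId>maven-shade-plugin</artifactId>
         <version>3.4.1</version>
         <executions>
            <execution>
               <phase>package</phase>
               <goals>
                  <goal>shade</goal>
               </goals>
               <configuration>
                  <filters>
                     <!-- Strip signature files copied from signed dependencies,
                          which would otherwise cause SecurityExceptions at runtime -->
                     <filter>
                        <artifact>*:*</artifact>
                        <excludes>
                           <exclude>META-INF/*.SF</exclude>
                           <exclude>META-INF/*.DSA</exclude>
                           <exclude>META-INF/*.RSA</exclude>
                        </excludes>
                     </filter>
                  </filters>
                  <transformers>
                     <!-- Set the entry point of the fat jar (placeholder class name) -->
                     <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <mainClass>com.example.MyFlinkJob</mainClass>
                     </transformer>
                  </transformers>
               </configuration>
            </execution>
         </executions>
      </plugin>
   </plugins>
</build>

After rebuilding, you can double-check with mvn dependency:tree (or by listing the contents of the shaded jar) that org.apache.kafka:kafka-clients is resolved at compile scope and that org/apache/kafka/clients/admin/AdminClientConfig.class actually ends up inside the jar you submit.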
