java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;

sulc1iza · posted 2021-06-25 in Flink

I am working with Flink and Kafka, and I am getting this error:

java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
    at kafka.consumer.FetchRequestAndResponseMetrics.<init>(FetchRequestAndResponseStats.scala:32)
    at kafka.consumer.FetchRequestAndResponseStats.<init>(FetchRequestAndResponseStats.scala:46)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$$anonfun$2.apply(FetchRequestAndResponseStats.scala:59)
    at kafka.utils.Pool.getAndMaybePut(Pool.scala:61)
    at kafka.consumer.FetchRequestAndResponseStatsRegistry$.getFetchRequestAndResponseStats(FetchRequestAndResponseStats.scala:63)
    at kafka.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:39)
    at kafka.javaapi.consumer.SimpleConsumer.<init>(SimpleConsumer.scala:34)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.getPartitionsForTopic(FlinkKafkaConsumer.java:695)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer.<init>(FlinkKafkaConsumer.java:281)
    at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082.<init>(FlinkKafkaConsumer082.java:49)
    at com.inndata.flinkkafka.ReadFromKafka.main(ReadFromKafka.java:19)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:505)
    at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:403)
    at org.apache.flink.client.program.Client.runBlocking(Client.java:248)
    at org.apache.flink.client.CliFrontend.executeProgramBlocking(CliFrontend.java:866)
    at org.apache.flink.client.CliFrontend.run(CliFrontend.java:333)
    at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1192)
    at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1243)

I am using Flink 1.0.3 with kafka_2.11-0.8.2.1 and Scala 2.11.5. These are the jars in my build path:

asm-4.0.jar,commons-codec-1.6.jar,commons-exec-1.1.jar,commons-fileupload-1.2.1.jar,commons-io-2.4.jar,commons-lang-2.5.jar,commons-logging-1.1.3.jar,commons-logging-api-1.1.jar,curator-test-3.2.0.jar,disruptor-2.10.1.jar,flink-annotations-1.0.3.jar,flink-clients_2.10-1.0.3.jar,flink-clients_2.10-1.0.3-tests.jar,flink-core-1.0.3.jar,flink-dist_2.10-1.0.3.jar,flink-java-1.0.3.jar,flink-optimizer_2.10-1.0.3.jar,flink-python_2.10-1.0.3.jar,flink-runtime_2.10-1.0.3.jar,flink-test-utils_2.10-1.0.3.jar,guava-11.0.2.jar,hadoop-client-2.6.0.jar,hadoop-mapreduce-client-core-2.6.0.jar,hamcrest-all-1.3.jar,jetty-6.1.26.jar,jetty-util-6.1.26.jar,json-simple-1.1.1.jar,junit-4.11.jar,log4j-1.2.17.jar,logback-classic-1.0.13.jar,logback-core-1.0.13.jar,slf4j-api-1.7.7.jar,slf4j-log4j12-1.7.7.jar,zkclient-0.3.jar,zookeeper-3.4.6.jar,flink-connector-kafka-0.10.2.jar,flink-streaming-java-0.10.2.jar,kafka-clients-0.8.2.1.jar,kafka_2.11-0.8.2.1-test.jar,kafka_2.11-0.8.2.1-sources.jar,kafka_2.11-0.8.2.1-scaladoc.jar,kafka_2.11-0.8.2.1-javadoc.jar,kafka_2.11-0.8.2.1.jar,scala-xml_2.11-1.0.2.jar,scala-parser-combinators_2.11-1.0.2.jar,scala-library-2.11.5.jar,metrics-core-2.2.0.jar

I tried searching for this issue but could not figure out which jar is responsible. Please help me resolve it.
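
For context, the failure happens while constructing the Kafka consumer (ReadFromKafka.java:19 in the trace). A minimal job against this API version would look roughly like the sketch below; the topic name, broker and ZooKeeper addresses, and group id are placeholders of mine, not values from the original post.

    import java.util.Properties;

    import org.apache.flink.streaming.api.datastream.DataStream;
    import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
    import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer082;
    import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

    public class ReadFromKafka {

        public static void main(String[] args) throws Exception {
            StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

            // Kafka 0.8-era consumer settings; the legacy fetcher also needs ZooKeeper.
            Properties props = new Properties();
            props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
            props.setProperty("zookeeper.connect", "localhost:2181"); // placeholder
            props.setProperty("group.id", "flink-test");              // placeholder

            // Constructing the consumer is what eventually calls into
            // kafka.javaapi.consumer.SimpleConsumer and hits scala.Predef.
            DataStream<String> stream = env.addSource(
                    new FlinkKafkaConsumer082<>("my-topic", new SimpleStringSchema(), props));

            stream.print();
            env.execute("Read from Kafka");
        }
    }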


s4chpxco1#

Several of those jars could be the source of the problem. scala-library-2.11.5.jar suggests that you are using Scala 2.11, which appears to match kafka_2.11-0.8.2.1.jar.
However, on your classpath I also see flink-clients_2.10-1.0.3.jar and flink-runtime_2.10-1.0.3.jar, among others. These are Flink libraries compiled against Scala 2.10, and that is the problem.
Scala binaries are not compatible between major versions. For the jars with a _2.10 suffix you need to find their 2.11 builds, or compile them against Scala 2.11 yourself.
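
Not part of the original answer, but a quick way to confirm the mismatch before swapping jars is a small JDK-only check that prints which jar each of the conflicting classes (names taken from the stack trace above) is actually loaded from. Run it with the same classpath as the job.

    public class ClasspathCheck {

        // Prints the jar a class was loaded from, e.g. .../scala-library-2.11.5.jar
        private static void whereIs(String className) throws ClassNotFoundException {
            Class<?> clazz = Class.forName(className);
            System.out.println(className + " -> "
                    + clazz.getProtectionDomain().getCodeSource().getLocation());
        }

        public static void main(String[] args) throws Exception {
            // Both class names appear in the stack trace above; repeat for any other
            // class you want to trace back to a jar (e.g. Flink runtime classes).
            whereIs("scala.Predef$");
            whereIs("kafka.javaapi.consumer.SimpleConsumer");
        }
    }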
