I am trying to run a Spark job on YARN, but I get the following error:
java.lang.NoSuchMethodError: com.google.common.util.concurrent.Futures.withFallback(Lcom/google/common/util/concurrent/ListenableFuture;Lcom/google/common/util/concurrent/FutureFallback;Ljava/util/concurrent/Executor;)Lcom/google/common/util/concurrent/ListenableFuture;
at com.datastax.driver.core.Connection.initAsync(Connection.java:176)
at com.datastax.driver.core.Connection$Factory.open(Connection.java:721)
at com.datastax.driver.core.ControlConnection.tryConnect(ControlConnection.java:248)
at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:194)
at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:82)
at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1307)
at com.datastax.driver.core.Cluster.init(Cluster.java:159)
at com.datastax.driver.core.Cluster.connect(Cluster.java:249)
at com.figmd.processor.ProblemDataloader$ParseJson.call(ProblemDataloader.java:46)
at com.figmd.processor.ProblemDataloader$ParseJson.call(ProblemDataloader.java:34)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:140)
at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:140)
at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:618)
at org.apache.spark.rdd.RDD$$anonfun$14.apply(RDD.scala:618)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:280)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:247)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
at org.apache.spark.scheduler.Task.run(Task.scala:56)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:200)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
Cluster details: Spark 1.2.1, Hadoop 2.7.1. I supplied the classpath via spark.driver.extraClassPath, and the hadoop user can access that classpath too, but I think YARN is not picking up the jar from it. I cannot find the root cause of this. Any help would be much appreciated.
Thanks.
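For reference, a sketch of a submit command that puts the jar on both the driver and the executor classpaths (all paths and jar names are placeholders; note that spark.driver.extraClassPath alone does not affect YARN executors, which is where this stack trace originates):

```shell
# Sketch only: paths and the Guava jar name are placeholders.
# The failing call runs inside executors (see Executor$TaskRunner in the
# trace), so the jar must reach them too, e.g. via --jars and
# spark.executor.extraClassPath, not only spark.driver.extraClassPath.
spark-submit \
  --master yarn-cluster \
  --class com.figmd.processor.ProblemDataloader \
  --jars /path/to/guava-16.0.1.jar \
  --conf spark.driver.extraClassPath=/path/to/guava-16.0.1.jar \
  --conf spark.executor.extraClassPath=/path/to/guava-16.0.1.jar \
  app.jar
```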
3 Answers
Answer 1
The problem is a Guava version mismatch: withFallback was added in Guava release 14, and it looks like you have Guava < 14 on your classpath.
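One way to confirm this diagnosis is to check, at runtime and via reflection, whether the Guava that the JVM actually loaded has the method in question. A minimal sketch (the class name GuavaCheck is my own; Futures.withFallback was added in Guava 14, while Hadoop 2.x bundles Guava 11):

```java
// Minimal sketch: check at runtime whether the class the JVM actually loaded
// exposes a given method. If this prints false for Futures.withFallback, the
// classpath is supplying a pre-14 Guava (e.g. the Guava 11 bundled by Hadoop).
public class GuavaCheck {
    static boolean hasMethod(String className, String methodName) {
        try {
            for (java.lang.reflect.Method m : Class.forName(className).getMethods()) {
                if (m.getName().equals(methodName)) {
                    return true;
                }
            }
        } catch (ClassNotFoundException e) {
            // Class is not on the classpath at all.
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(hasMethod(
                "com.google.common.util.concurrent.Futures", "withFallback"));
    }
}
```

Running this inside a Spark executor (e.g. from a small test job) tells you which Guava the executors see, independently of what the driver sees.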
Answer 2
Adding to @arjones' answer: if you are using Gradle with the Gradle Shadow plugin, you can add the following to build.gradle to relocate (rename) the Guava classes.
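The build.gradle snippet itself did not survive the page extraction; a minimal sketch of a Shadow relocate block (the shaded package prefix is an arbitrary choice):

```groovy
// build.gradle (Gradle Shadow plugin): move Guava into a private package so
// the Cassandra driver's Guava cannot collide with Hadoop/Spark's older copy.
shadowJar {
    relocate 'com.google.common', 'com.example.shaded.google.common'
}
```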
Answer 3
I was facing the same problem, and the solution was to shade Guava to avoid the classpath collision. If you use sbt-assembly to build your jar, you just need to add the following to build.sbt:
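The build.sbt snippet was lost in extraction as well; a minimal sketch using sbt-assembly's shade rules (the shadeio prefix is an arbitrary choice, and the `in assembly` syntax matches the sbt versions of the Spark 1.x era):

```scala
// build.sbt (sbt-assembly): rename Guava's packages inside the fat jar so the
// Cassandra driver resolves its own Guava instead of Hadoop's Guava 11.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)
```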
I wrote a blog post describing how I arrived at this solution: making Hadoop 2.6 + the Spark Cassandra driver play nicely together.
Hope it helps!