NoSuchMethodError after upgrading Spark to 2.4.5

vyu0f0g1 · posted 2021-07-13 in Spark

After upgrading the Spark version in our project to 2.4.5 (as part of a general 1.5.1 product upgrade), we get the error below when running the test suite in the scoring batch module. Our build.gradle file is largely consistent with the one in the example project.

java.util.concurrent.ExecutionException: java.lang.NoSuchMethodError: org.apache.spark.sql.internal.StaticSQLConf$.DEFAULT_URL_STREAM_HANDLER_FACTORY_ENABLED()Lorg/apache/spark/internal/config/ConfigEntry;
  at java.util.concurrent.FutureTask.report(FutureTask.java:122)
  at java.util.concurrent.FutureTask.get(FutureTask.java:192)
  at org.scalatest.tools.ConcurrentDistributor.waitUntilDone(ConcurrentDistributor.scala:50)
  at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1299)
  at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:972)
  at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:971)
  at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1474)
  at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:971)
  at org.scalatest.tools.Runner$.main(Runner.scala:775)
  at org.scalatest.tools.Runner.main(Runner.scala)
  Cause: java.lang.NoSuchMethodError: org.apache.spark.sql.internal.StaticSQLConf$.DEFAULT_URL_STREAM_HANDLER_FACTORY_ENABLED()Lorg/apache/spark/internal/config/ConfigEntry;
  at org.apache.spark.sql.internal.SharedState$.org$apache$spark$sql$internal$SharedState$$setFsUrlStreamHandlerFactory(SharedState.scala:165)
  at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:45)
  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:121)
  at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:121)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:121)
  at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:120)
  at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:286)
  at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1115)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:145)
  at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:144)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:144)
  at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:141)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$applyModifiableSettings$1.apply(SparkSession.scala:974)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$applyModifiableSettings$1.apply(SparkSession.scala:974)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
  at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:130)
  at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:236)
  at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
  at scala.collection.mutable.HashMap.foreach(HashMap.scala:130)
  at org.apache.spark.sql.SparkSession$Builder.applyModifiableSettings(SparkSession.scala:974)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:907)
  at com.quantexa.analytics.test.SparkTestSuite.spark$lzycompute(SparkTestSuite.scala:27)
  at com.quantexa.analytics.test.SparkTestSuite.spark(SparkTestSuite.scala:22)
  at com.quantexa.scb.trade.scoring.batch.utils.ScoringGeneratedDataTestSuite$class.beforeAll(ScoringGeneratedDataTestSuite.scala:38)
  at com.quantexa.scb.trade.scoring.batch.utils.EndToEndNetworkBuildTest.beforeAll(EndToEndNetworkBuildTest.scala:12)
  at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
  at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
  at com.quantexa.analytics.test.SparkTestSuite.run(SparkTestSuite.scala:19)
  at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
  at java.util.concurrent.FutureTask.run(FutureTask.java:266)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
  at java.lang.Thread.run(Thread.java:748)
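
A quick way to confirm which Spark version actually ends up on the test classpath, and which library pulls in the mismatched one, is Gradle's dependencyInsight report (for example ./gradlew dependencyInsight --dependency spark-sql --configuration testRuntimeClasspath). The helper task below is only a sketch: the task name and the testRuntimeClasspath configuration are assumptions and may need to be adapted to the project's own configurations.

// Sketch of a diagnostic task (task name and configuration are assumptions).
// It lists every org.apache.spark artifact Gradle resolves for the tests,
// so a 2.4.5 / 2.4.6 mix becomes visible at a glance.
task printSparkVersions {
    doLast {
        configurations.testRuntimeClasspath.resolvedConfiguration.resolvedArtifacts
            .findAll { it.moduleVersion.id.group == 'org.apache.spark' }
            .each { println it.moduleVersion.id }
    }
}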

sqxo8psd (answer 1)

The problem was caused by a Spark version conflict between 2.4.5 and 2.4.6: other libraries were pulling in 2.4.6. To resolve it, the Spark libraries had to be pinned to 2.4.5, with the following changes made in the relevant build.gradle file:

provided("org.apache.spark:spark-core_$scalaVersion:$spark"){force = true}
provided("org.apache.spark:spark-sql_$scalaVersion:$spark"){force = true}
provided("org.apache.spark:spark-yarn_$scalaVersion:$spark"){force = true}
provided("org.apache.spark:spark-hive_$scalaVersion:$spark"){force = true}
provided("org.apache.spark:spark-graphx_$scalaVersion:$spark"){force = true}
provided("org.apache.spark:spark-mllib_$scalaVersion:$spark"){force = true}
