Apache Spark quick start: "java.lang.NoClassDefFoundError: scala/Serializable"

bakd9h0s · posted 2022-12-04 in Apache
Follow (0) | Answers (2) | Views (283)

I am trying to follow the guide at https://spark.apache.org/docs/latest/quick-start.html (Scala), but I cannot complete the final step, where the jar file is submitted to Spark.

# Use spark-submit to run your application
$ YOUR_SPARK_HOME/bin/spark-submit \
  --class "SimpleApp" \
  --master local[4] \
  target/scala-2.12/simple-project_2.12-1.0.jar

I get the following exception:

Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/Serializable
        at SimpleApp$.main(SimpleApp.scala:9)
        at SimpleApp.main(SimpleApp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: scala/Serializable
        ... 14 more
Caused by: java.lang.ClassNotFoundException: scala.Serializable
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        ... 14 more

Any idea what is causing this?

wlwcrazw1#

You need to upgrade your dependencies to versions that are binary-compatible with your Scala version, which in this case looks like 2.12.
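Concretely, for the sbt build used in the quick-start guide, `scalaVersion` must match the Scala binary version of the Spark distribution you submit to. A minimal sketch of `build.sbt` (project name and exact version numbers are illustrative; substitute the versions of your local Spark installation):

```scala
// build.sbt -- minimal sketch, versions are illustrative.
name := "simple-project"
version := "1.0"

// Must match the Scala binary version Spark was built with:
// 2.12.x for a "Scala 2.12" Spark distribution, 2.13.x for a
// "Scala 2.13" one. Mixing the two causes NoClassDefFoundError
// on Scala library classes such as scala/Serializable.
scalaVersion := "2.12.18"

// "provided" because spark-submit supplies Spark's own jars at runtime.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.1" % "provided"
```

The `%%` operator appends the Scala binary suffix (`_2.12`) to the artifact name, so the Spark dependency is automatically resolved for the Scala version declared above.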

b1zrtrql2#

I found the problem: I had installed the wrong Spark build. I had downloaded the "Pre-built for Apache Hadoop 3.3 and later (Scala 2.13)" package; installing the plain "Pre-built for Apache Hadoop 3.3 and later" package (built with Scala 2.12) solved the problem.
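If in doubt about which Scala version a given Spark distribution was built with, one quick check (a sketch, assuming `spark-shell` from that distribution is on your path) is to evaluate the runtime Scala version inside the shell:

```scala
// Run inside spark-shell: prints the Scala version the shell (and hence
// the Spark distribution) was built with, e.g. "version 2.12.15".
println(scala.util.Properties.versionString)
```

The startup banner of `spark-submit --version` also reports the Scala version the distribution uses.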
