Spark InvalidClassException

kmpatx3s · posted 2023-01-02 in Apache

I am trying to submit code directly from IntelliJ to the Spark master:

import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
    println("***hello spark-test***")
    // Connect to the standalone master directly from the IDE
    val spark = SparkSession
      .builder()
      .master("spark://172.22.208.1:7077")
      .appName("Spark-Test Application")
      .getOrCreate()
    import spark.implicits._
    val rawData = Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
    val rdd = spark.sparkContext.parallelize(rawData)
    val df = rdd.toDF()
    // Count the odd numbers in the single "value" column
    val result = df.filter($"value" % 2 === 1).count()
    println(s"***Result odd numbers count: $result ***")
    spark.stop()
  }

Result:
The application log hangs while connecting to the master:

22/12/29 12:38:55 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://172.22.208.1:7077...
22/12/29 12:38:55 INFO TransportClientFactory: Successfully created connection to /172.22.208.1:7077 after 38 ms (0 ms spent in bootstraps)
22/12/29 12:39:15 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://172.22.208.1:7077...

Driver logs:

22/12/29 12:39:35 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: scala.collection.immutable.ArraySeq; local class incompatible: stream classdesc serialVersionUID = -8615987390676041167, local class serialVersionUID = 2701977568115426262

Spark version: spark-3.3.0-hadoop3-scala2.13
Scala version in IntelliJ: 2.13.5
However, when I remove the .master("spark://172.22.208.1:7077") line, build a jar, and submit it via spark-submit, it works fine.
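For comparison, the working spark-submit path might look like the sketch below; the jar path and main class name (SparkTest) are placeholders for your own build output, not taken from the question:

spark-submit \
  --master spark://172.22.208.1:7077 \
  --class SparkTest \
  target/scala-2.13/spark-test_2.13-0.1.jar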

envsm3lx · answer 1

Usually this error means the driver and the cluster are not running the same Spark or Scala version.
You need to make sure the code you launch from IntelliJ runs on the same Scala and Spark versions as the code running on the cluster.
Since you mention running Scala 2.13.5 in IntelliJ, my guess is that your local Scala (and possibly Spark) version does not match the one on the cluster; an InvalidClassException with mismatched serialVersionUID values on a standard library class like scala.collection.immutable.ArraySeq is consistent with a Scala patch-release mismatch.
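A minimal build.sbt sketch of aligned versions, assuming the cluster runs the spark-3.3.0-hadoop3-scala2.13 distribution (built against Scala 2.13.8; verify the exact patch release on your cluster):

// build.sbt: pin Scala and Spark to the versions the cluster was built with
scalaVersion := "2.13.8"  // assumed; must match the cluster's Scala patch release

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.3.0",
  "org.apache.spark" %% "spark-sql"  % "3.3.0"
)

To confirm what the cluster itself is running, spark-shell on a cluster node prints the Spark and Scala versions in its startup banner, and scala.util.Properties.versionString returns the Scala version at runtime.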
