Scala: problem with SparkSession

h22fl7wq · posted 2022-11-09 · in Scala

I am getting this error

Exception in thread "main" java.lang.NoSuchMethodError: 'org.apache.spark.internal.Logging.$init$(org.apache.spark.internal.Logging)'
    at org.apache.spark.sql.SparkSession$.<init>(SparkSession.scala:787)
    at org.apache.spark.sql.SparkSession$.<clinit>(SparkSession.scala)

when I try to execute:

val conf = new SparkConf().setMaster("local[*]").setAppName("MTBD")

val spark = SparkSession
    .builder()
    .appName("MTBD")
    .config(conf)
    .getOrCreate()

If I comment out the spark = SparkSession ... statement, there is no error.
I am using IntelliJ.
Other info:

scala -version
Scala code runner version 2.11.12

spark version 3.2.2

build.sbt 
ThisBuild / version := "0.1.0-SNAPSHOT"

ThisBuild / scalaVersion := "2.11.12"

lazy val root = (project in file("."))
  .settings(
    name := "provaMTBD"
  )

libraryDependencies ++= Seq( "org.apache.spark" % "spark-core_2.11" % "2.1.0")

Edit: in the spark-shell I get no error, and I can also read a CSV.
This is the spark-shell output:

val  spark =  SparkSession.builder().appName("MTBD").config(conf).getOrCreate()
22/10/15 01:14:25 WARN SparkSession$Builder: Using an existing SparkSession; some spark core configurations may not take effect.
spark: org.apache.spark.sql.SparkSession = org.apache.spark.sql.SparkSession@5fbcc71d
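One way to see which Scala version the IntelliJ build actually compiles and runs against is to print it at runtime; it must match the `_2.11`/`_2.12` suffix of every Spark artifact on the classpath. A minimal, standard-library-only sketch:

```scala
// Minimal check: print the Scala version this program is compiled/run with.
// It must match the suffix of every "spark-core_<x.y>"-style dependency.
object VersionCheck extends App {
  println(scala.util.Properties.versionNumberString) // e.g. "2.12.15"
}
```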

92dk7w1h (answer 1):

I think your Spark and Scala versions are mixed up.
If you want to use Spark 3.2.2, try Scala 2.12, like this:

libraryDependencies += "org.apache.spark" % "spark-core_2.12" % "3.2.2"

Reference: https://search.maven.org/artifact/org.apache.spark/spark-core_2.12/3.2.2/jar
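Putting this together, one possible complete build.sbt (the exact Scala patch version is an assumption; 3.2.2 matches the Spark version mentioned in the question) pins Scala to 2.12 and pulls both Spark modules at 3.2.2. Note that `SparkSession` lives in the spark-sql module, so spark-core alone is not enough, and using `%%` instead of `%` lets sbt append the matching `_2.12` suffix automatically:

```scala
// Sketch of a complete build.sbt for Spark 3.2.2 on Scala 2.12.
// "2.12.15" is an assumption (the Scala version Spark 3.2.x targets);
// "provaMTBD" is the project name from the original post.
ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.15" // Spark 3.2.x is not built for 2.11

lazy val root = (project in file("."))
  .settings(name := "provaMTBD")

libraryDependencies ++= Seq(
  // %% appends the Scala binary-version suffix (_2.12) automatically,
  // so the artifact suffix can never drift out of sync with scalaVersion.
  "org.apache.spark" %% "spark-core" % "3.2.2",
  "org.apache.spark" %% "spark-sql"  % "3.2.2" // SparkSession is in spark-sql
)
```

With mismatched suffixes (e.g. spark-core_2.11 compiled against a 2.12 runtime), trait initializers like `Logging.$init$` are encoded differently, which is exactly the kind of `NoSuchMethodError` shown above.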
