IntelliJ setup with Spark, Scala and sbt

zwghvu4y asked on 2021-07-14 in Spark

I am setting up a Spark + Scala + sbt project in IntelliJ.

Scala Version: 2.12.8
SBT Version: 1.4.2
Java Version: 1.8

The build.sbt file:

name := "Spark_Scala_Sbt"
version := "0.1"
scalaVersion := "2.12.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.3",
  "org.apache.spark" %% "spark-sql" % "2.3.3"
)
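
One thing worth checking in this build definition: Spark 2.3.x was only published for Scala 2.11, so with scalaVersion set to 2.12.8 the %% operator asks for spark-core_2.12 and spark-sql_2.12 at version 2.3.3, which do not exist on Maven Central; every Spark symbol then stays unresolved in the IDE. Below is a minimal sketch of a version-consistent build.sbt, assuming the Scala version is the part to keep and using Spark 2.4.8 purely as an example of a release that does ship Scala 2.12 artifacts:

name := "Spark_Scala_Sbt"
version := "0.1"
scalaVersion := "2.12.8"

// Spark 2.4.x is the first release line with Scala 2.12 artifacts on Maven Central
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.8",
  "org.apache.spark" %% "spark-sql" % "2.4.8"
)

The other direction, keeping Spark 2.3.3 and dropping scalaVersion to 2.11.12, should resolve as well.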

The Scala file:

import org.apache.spark.sql.SparkSession

object FirstSparkApplication extends App {
  val spark = SparkSession.builder
    .master("local[*]")
    .appName("Sample App")
    .getOrCreate()
  val data = spark.sparkContext.parallelize(
    Seq("I like Spark", "Spark is awesome", "My first Spark job is working now and is counting down these words")
  )
  val filtered = data.filter(line => line.contains("awesome"))
  filtered.collect().foreach(print)
}

But IntelliJ shows the following error messages:

1. Cannot resolve symbol apache
2. Cannot resolve symbol SparkSession
3. Cannot resolve symbol sparkContext
4. Cannot resolve symbol filter
5. Cannot resolve symbol collect
6. Cannot resolve symbol contains
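
All six messages are the IDE failing to see the Spark classes at all, which points at dependency resolution rather than the code itself. Assuming sbt 1.4.2 is available on the PATH, building once from a terminal in the project root is a quick way to separate an sbt problem from an IntelliJ indexing problem:

sbt clean compile

If sbt itself reports the spark-core_2.12 dependency as unresolved, the fix belongs in build.sbt; if the command-line build succeeds, reloading the sbt project in IntelliJ usually clears the remaining red marks.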

What should I change here?

No answers yet.
