Starting Spark from the Scala shell

nqwrtyyt asked on 2022-12-13 in Scala

How do I start Spark from the sbt shell? I don't want to use the spark-shell command. I want to use Spark together with the objects defined in my sbt project.

Answer by ss2ws0br:

  • Add the Spark dependencies to build.sbt (a complete build.sbt sketch appears at the end of this answer):
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.1"
  • Run the sbt console:

sbt console

  • Load the Spark session/context:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
// A local session is enough for interactive work in the console
val spark = SparkSession.builder().master("local").appName("spark-shell").getOrCreate()
import spark.implicits._ // enables toDF, the $"col" syntax, and so on
val sc = spark.sparkContext
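
With the session up, everything on the project's classpath is available in the console. A minimal sketch, assuming the project sources define a hypothetical case class Person(name: String, age: Int) in a package myproject:

import myproject.Person // hypothetical class from the project's own sources
val df = Seq(Person("Alice", 30), Person("Bob", 25)).toDF() // toDF comes from spark.implicits._
df.filter($"age" > 26).show() // prints only Alice's row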

Alternatively, have sbt run these commands automatically every time the console starts, by setting initialCommands in build.sbt:

initialCommands in console := s"""
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
val spark = SparkSession.builder().master("local").appName("spark-shell").getOrCreate()
import spark.implicits._
val sc = spark.sparkContext
"""
