How do I launch Spark from the sbt shell? I don't want to use the spark-shell command; I want to use Spark together with the objects from my sbt project.
1 Answer
build.sbt
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.1"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.1"
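Note that Spark 2.1.1 artifacts are published for Scala 2.11, so the project's scalaVersion has to match. A minimal sketch of the surrounding build.sbt settings (the project name and version here are placeholders):

name := "spark-console-demo"   // placeholder project name
version := "0.1.0"             // placeholder version
scalaVersion := "2.11.11"      // Spark 2.1.1 is built for Scala 2.11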
sbt console
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local").appName("spark-shell").getOrCreate()
import spark.implicits._
val sc = spark.sparkContext
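The advantage over spark-shell is that the console runs on the project's own classpath, so the project's classes are usable right away. A minimal sketch, assuming the project defines a hypothetical case class Person(name: String, age: Int) somewhere under src/main/scala:

// assumes the project source contains: case class Person(name: String, age: Int)
val people = Seq(Person("Ann", 30), Person("Bob", 25)).toDF()
people.show()
people.filter($"age" > 26).show()   // the $"..." column syntax comes from spark.implicits._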
Or, to run these commands automatically every time the console starts, add this to build.sbt:
initialCommands in console := s"""
  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.functions._
  val spark = SparkSession.builder().master("local").appName("spark-shell").getOrCreate()
  import spark.implicits._
  val sc = spark.sparkContext
"""
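Optionally, the session can be shut down cleanly when you leave the REPL by pairing this with sbt's cleanupCommands setting; a sketch of how it could accompany the snippet above:

cleanupCommands in console := "spark.stop()"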