I am building a Spark application that has to ship as a single fat jar and run in local mode, with no cluster and no Hive installation on the machine: just the jar.
With sbt run it works as expected. The fat jar built with sbt assembly, however, fails with an exception saying HiveMetaStoreClient cannot be instantiated.
To reproduce:
Main.scala
import org.apache.spark.sql.SparkSession

object Main extends App {
  // Local-mode session with Hive support; no external metastore is expected.
  val spark =
    SparkSession.builder
      .appName("logiQ")
      .config("spark.master", "local[2]")
      .enableHiveSupport()
      .getOrCreate()

  System.setSecurityManager(null)

  spark.sql("create database test")
  spark.sql("show databases").show()
}
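(I start this either with sbt run, or, after sbt assembly, with something like java -cp target/scala-2.12/test.jar Main; the jar path is assumed from the sbt-assembly defaults for the build below.)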
build.sbt
name := "test"
scalaVersion := "2.12.11"

val sparkVersion = "2.4.6"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-sql"  % sparkVersion
libraryDependencies += "org.apache.spark" %% "spark-hive" % sparkVersion

assemblyJarName in assembly := s"${name.value}.jar"

assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "MANIFEST.MF") => MergeStrategy.discard
  case PathList("META-INF", xs @ _*)       => MergeStrategy.filterDistinctLines
  case "plugin.xml"                        => MergeStrategy.last
  case _                                   => MergeStrategy.first
}
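One suspicion I have is the META-INF handling above, though I have not confirmed it. For comparison, here is a variant merge strategy sketch (an assumption on my part, not a known fix): it concatenates the ServiceLoader registries under META-INF/services, which Spark SQL uses to discover DataSourceRegister implementations, and discards jar signature files instead of line-filtering them.

assemblyMergeStrategy in assembly := {
  // Keep every service-loader entry contributed by spark-sql, spark-hive, etc.
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
  // Signature files from signed dependency jars are invalid after merging.
  case PathList("META-INF", xs @ _*)             => MergeStrategy.discard
  // DataNucleus ships several plugin.xml files; picking one may lose plugins.
  case "plugin.xml"                              => MergeStrategy.last
  case _                                         => MergeStrategy.first
}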
How is this possible? Isn't a fat jar just a jar containing all the dependencies? Why does sbt run work when the assembly doesn't? What hidden magic am I missing?
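In case it helps with diagnosing, here is a small inspection sketch I would run against the fat jar (the path is assumed from the build above; sbt-assembly writes target/scala-2.12/test.jar for this configuration). Under sbt run every dependency jar contributes its own META-INF/services files, while the fat jar has to merge them into a single file per service, so listing what survived the merge shows whether anything was lost:

import java.util.jar.JarFile
import scala.collection.JavaConverters._

object InspectAssembly extends App {
  // List the ServiceLoader registries that made it into the assembled jar.
  val jar = new JarFile("target/scala-2.12/test.jar")
  jar.entries.asScala
    .map(_.getName)
    .filter(_.startsWith("META-INF/services/"))
    .foreach(println)
  jar.close()
}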