sbt assembly with assemblyExcludedJars not working

gcmastyq asked on 2021-05-27 in Spark

I'm trying to build a fat jar for Spark 3.0 with the sbt assembly command. At first I marked the Spark dependencies as provided so they would be excluded from the assembly, but then they are missing from the classpath when I run the application inside IntelliJ IDEA. Based on this answer, I thought I could use the assemblyExcludedJars key instead. However, it does not seem to work: the jars I want excluded still end up in the fat jar, whether I run sbt assembly with or without the provided markers. Here is my build.sbt:

name := "explore-spark"
version := "0.2"
scalaVersion := "2.12.7"
val sparkVersion = "3.0.0"

// I have to use "provided" since the assemblyExcludedJars is not working
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-streaming" % sparkVersion, // % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion, // % "provided",
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion, // % "provided",
  "io.dropwizard.metrics" % "metrics-core" % "4.1.11" % "provided",
  "com.twitter" %% "algebird-core" % "0.13.7",
  "joda-time" % "joda-time" % "2.5",
  "org.fusesource.mqtt-client" % "mqtt-client" % "1.16"
)

assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  // contains("spark") alone already matches the three explicit jar names below
  cp filter { f =>
    f.data.getName.contains("spark") ||
    f.data.getName == "spark-streaming_2.12-3.0.0.jar" ||
    f.data.getName == "spark-sql_2.12-3.0.0.jar" ||
    f.data.getName == "spark-streaming-kafka-0-10_2.12-3.0.0.jar"
  }
}

mainClass in (Compile, packageBin) := Some("org.sense.spark.app.App")
mainClass in assembly := Some("org.sense.spark.app.App")

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyJarName in assembly := s"${name.value}_${scalaBinaryVersion.value}-fat_${version.value}.jar"
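
For completeness: the fallback I know of (the pattern shown in the sbt-assembly README, if I remember it correctly) is to keep the provided markers and re-define the run task so that provided dependencies come back onto the run classpath for local sbt run. A rough sketch in the same syntax as above:

run in Compile := Defaults.runTask(
  fullClasspath in Compile,      // the Compile classpath includes provided-scope jars
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated

IntelliJ IDEA also has an "Include dependencies with 'Provided' scope" checkbox in Application run configurations, which should cover the IDE side, but I would still prefer assemblyExcludedJars to work as advertised.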

My assembly.sbt file contains the following lines:

addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")

resolvers += Resolver.bintrayIvyRepo("com.eed3si9n", "sbt-plugins")
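
In case it helps anyone answer: one way to see what the filter actually matches would be a small debug task added to build.sbt. This is only a sketch; printExcluded is a name I made up, not part of sbt-assembly:

lazy val printExcluded = taskKey[Unit]("Print the jars matched by assemblyExcludedJars")

printExcluded := {
  // Classpath is Seq[Attributed[File]], so .data gives the underlying File
  (assemblyExcludedJars in assembly).value.foreach(f => println(f.data.getName))
}

If the task prints nothing, the filter never matches anything and the problem is in the predicate rather than in the assembly step itself.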

No answers yet.

