How do I fix the "origin location must be absolute" error in an sbt project (Spark 2.4.5 with Delta Lake 0.6.1)?

qco9c6ql · asked 2021-05-29 · in Spark

I am trying to set up an sbt project for Spark 2.4.5 with Delta Lake 0.6.1. My build file is shown below.
However, with this configuration some dependencies fail to resolve:

[info] Reapplying settings...
[info] Set current project to red-basket-pipelnes (in build file:/Users/ashika.umagiliya/git-repo/redbasket-pipelines/red-basket-pipelnes/)
[info] Updating ...
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.antlr#antlr4;4.7: org.antlr#antlr4;4.7!antlr4.pom(pom.original) origin location must be absolute: file:/Users/ashika.umagiliya/.m2/repository/org/antlr/antlr4/4.7/antlr4-4.7.pom
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      org.antlr:antlr4:4.7
[warn]        +- io.delta:delta-core_2.11:0.6.1 (/Users/ashika.umagiliya/git-repo/redbasket-pipelines/red-basket-pipelnes/build.sbt#L13-26)
[warn]        +- com.mycompany.dpd.solutions:deltalake-pipelnes_2.11:1.0
[error] sbt.librarymanagement.ResolveException: unresolved dependency: org.antlr#antlr4;4.7: org.antlr#antlr4;4.7!antlr4.pom(pom.original) origin location must be absolute: file:/Users/ashika.umagiliya/.m2/repository/org/antlr/antlr4/4.7/antlr4-4.7.pom
[error]     at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:332)

build.sbt

name := "deltalake-pipelnes"
version := "1.0"
organization := "com.mycompany.dpd.solutions"

// The compatible Scala version for Spark 2.4.x is 2.11
scalaVersion := "2.11.12"

val sparkVersion = "2.4.5"
val scalatestVersion = "3.0.5"
val deltaLakeCore = "0.6.1"
val sparkTestingBaseVersion = s"${sparkVersion}_0.14.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-hive" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-avro" % sparkVersion % "provided",

  "io.delta" %% "delta-core" % deltaLakeCore,

  "org.scalatest" %% "scalatest" % scalatestVersion % "test",
  "com.holdenkarau" %% "spark-testing-base" % sparkTestingBaseVersion % "test"
)

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", xs @ _*) => MergeStrategy.last
  case PathList("changelog.txt") => MergeStrategy.last
  case PathList(ps @ _*) if ps.last contains "spring" => MergeStrategy.last

  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

resolvers ++= Seq(
  "SPDB Maven Repository" at "https://artifactory.mycompany-it.com/spdb-mvn/",
  Resolver.mavenLocal)

publishMavenStyle := true
publishTo := {
  val repoBaseUrl = "https://artifactory.mycompany-it.com/"
  if (isSnapshot.value)
    Some("snapshots" at repoBaseUrl + "spdb-mvn-snapshot/")
  else
    Some("releases"  at repoBaseUrl + "spdb-mvn-release/")
}
publishConfiguration := publishConfiguration.value.withOverwrite(true)
publishLocalConfiguration := publishLocalConfiguration.value.withOverwrite(true)
credentials += Credentials(Path.userHome / ".sbt" / ".credentials")

artifact in (Compile, assembly) := {
  val art = (artifact in (Compile, assembly)).value
  art.withClassifier(Some("assembly"))
}

addArtifact(artifact in (Compile, assembly), assembly)

parallelExecution in Test := false

Any suggestions on how to resolve this?


8e2ybdfx (answer #1)

I have not yet figured out when or why this happens, but I have run into similar resolution errors before.
Whenever I hit an issue like yours, I usually delete the affected directory (e.g. /Users/ashika.umagiliya/.m2/repository/org/antlr) and start over. That usually helps.
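The cache cleanup above can be sketched as follows (the paths are taken from the error message in the question; adjust the user name for your machine):

```shell
# Remove the possibly corrupt antlr4 artifacts from the local Maven repository;
# sbt/Ivy will re-download them on the next dependency resolution.
rm -rf "$HOME/.m2/repository/org/antlr"

# Clear sbt's own Ivy cache entry for the same artifact as well, to be safe.
rm -rf "$HOME/.ivy2/cache/org.antlr"

# Then re-run resolution from the project root:
# sbt update
```

Since the resolver list includes `Resolver.mavenLocal`, a half-written POM in `~/.m2/repository` is a plausible cause of the "origin location must be absolute" failure; deleting the directory forces a clean re-fetch from the remote repositories.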
I always make sure to use the latest and greatest sbt. You seem to be on macOS, so run `brew update` early and often.
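Note that the sbt version a project actually builds with is pinned in `project/build.properties`, independently of the launcher Homebrew installs; bumping it is a one-line change (the version number below is only an example of a then-recent sbt 1.x release):

```
sbt.version=1.3.13
```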
I would also recommend using the latest and greatest libraries; more specifically, for Spark that would be 2.4.6 (on the 2.4.x line), and for Delta Lake 0.7.0.
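One caveat on the version advice above: Delta Lake 0.7.0 was the first release built for Spark 3.0 and Scala 2.12, so it cannot be combined with Spark 2.4.x; on the 2.4.x / Scala 2.11 line, 0.6.x remains the newest compatible Delta release. A conservative bump in build.sbt would therefore look like this:

```scala
// Stay on the Spark 2.4.x / Scala 2.11 line; only the Spark patch version changes.
val sparkVersion = "2.4.6"
val deltaLakeCore = "0.6.1" // latest Delta Lake release compatible with Spark 2.4.x
```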
