Spark sbt compile error: library dependencies

eblbsuwk · posted 2021-05-29 in Hadoop

I am using spark-1.2.0-bin-hadoop2.4 and my Scala version is 2.11.7. I get an error, so I can't use sbt.

~/sparksample$ sbt
```
Starting sbt: invoke with -help for other options
[info] Set current project to Spark Sample (in build file:/home/beyhan/sparksample/)
```
> sbt compile
```
[info] Updating {file:/home/beyhan/sparksample/}default-f390c8...
[info] Resolving org.scala-lang#scala-library;2.11.7 ...
[info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
[warn] ==== local: tried
[warn] /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.11.7/1.2.0/spark-core_2.11.7-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/beyhan/sparksample/}default-f390c8/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.11.7;1.2.0: not found
[error] Total time: 2 s, completed Oct 15, 2015 11:30:47 AM
```

Any suggestions? Thanks.

y0u0uwnf1#

There is no spark-core_2.11.7 jar. You have to drop the maintenance version number .7, because spark-core_2.11 does exist. All Scala 2.11.x versions should be binary compatible.

Update

A minimal sbt build file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.1"
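A note on `%%`: it appends the project's Scala binary version to the artifact name, so with `scalaVersion := "2.11.7"` it resolves spark-core_2.11 (the patch version .7 never appears). The dependency above is therefore equivalent to this explicit form:

```
// Equivalent explicit coordinates: %% contributes only the binary-version
// suffix _2.11, which is why this artifact can actually be resolved.
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.1"
```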

alen0pnh2#

```
[info] Updating {file:/home/beyhan/sparksample/}default-f390c8
[info] Resolving org.scala-lang#scala-library;2.11.7 ...
[info] Resolving org.apache.spark#spark-core_2.11.7;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.11.7;1.2.0
[warn] ==== local: tried
[warn] /home/beyhan/.ivy2/local/org.apache.spark/spark-core_2.11.7/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn]
```


noj0wjuj3#

As rohrmann already suggested, there is no spark-core_2.11.7, yet your build.sbt seems to reference exactly that library.
I suggest you edit /home/beyhan/sparksample/build.sbt and remove the reference to that library.
The correct reference is:

```
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.2.0"
```

Keep in mind that it is not only spark-core that has no 2.11.7 version; the same goes for any other Spark libraries you may be using.
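For instance, if the build pulls in further Spark modules, each of them needs the same fix; a minimal sketch, assuming hypothetically that spark-sql and spark-streaming are also used:

```
// Hypothetical sketch: spark-sql and spark-streaming stand in for whatever
// other Spark modules the build actually uses. Every artifact name carries
// only the Scala binary-version suffix _2.11, never the full 2.11.7.
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11"      % "1.2.0",
  "org.apache.spark" % "spark-sql_2.11"       % "1.2.0",
  "org.apache.spark" % "spark-streaming_2.11" % "1.2.0"
)
```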
