This question already has answers here:
Resolving dependency problems in Apache Spark (7 answers)
Closed 2 years ago.
I have the Scala code below and am compiling and running it with sbt. sbt run works fine.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{StreamingContext, Seconds}
import com.couchbase.spark.streaming._

object StreamingExample {
  def main(args: Array[String]): Unit = {
    // Create the Spark config and instruct it to use the travel-sample bucket
    // with no password.
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("StreamingExample")
      .set("com.couchbase.bucket.travel-sample", "")

    // Initialize the StreamingContext with a batch interval of 5 seconds.
    val ssc = new StreamingContext(conf, Seconds(5))

    // Consume the DCP stream from the beginning and never stop.
    // For every mutation or deletion in each interval, print the document key.
    ssc
      .couchbaseStream(from = FromBeginning, to = ToInfinity)
      .foreachRDD(rdd => {
        rdd.foreach(message => {
          //println(message.getClass());
          message.getClass();
          if (message.isInstanceOf[Mutation]) {
            val document = message.asInstanceOf[Mutation].key.map(_.toChar).mkString
            println("mutated: " + document);
          } else if (message.isInstanceOf[Deletion]) {
            val document = message.asInstanceOf[Deletion].key.map(_.toChar).mkString
            println("deleted: " + document);
          }
        })
      })

    // Start the stream and await termination.
    ssc.start()
    ssc.awaitTermination()
  }
}
But it fails when run as a Spark job like this:
spark-submit --class "StreamingExample" --master "local[*]" target/scala-2.11/spark-samples_2.11-1.0.jar
The error is:
java.lang.NoSuchMethodError: com.couchbase.spark.streaming.Mutation.key()
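A java.lang.NoSuchMethodError at runtime almost always means that the class actually loaded is a different (usually older) build than the one the code was compiled against, not that anything is wrong with the compilation itself. As a minimal diagnostic sketch (not part of the original job; it assumes Mutation is the class imported above and that it was loaded from a jar), one could temporarily print which jar supplied the class when running under spark-submit:

// Hypothetical diagnostic: print which jar the Mutation class was loaded from.
// getCodeSource can return null for classes on the boot classpath.
val source = classOf[com.couchbase.spark.streaming.Mutation]
  .getProtectionDomain
  .getCodeSource
println("Mutation loaded from: " +
  Option(source).map(_.getLocation.toString).getOrElse("<unknown>"))

If the printed location points inside the Spark installation's jars directory rather than at the application jar, an older connector on Spark's classpath is shadowing the 2.2.0 build the code was compiled against.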
Here is my build.sbt:
lazy val root = (project in file(".")).
  settings(
    name := "spark-samples",
    version := "1.0",
    scalaVersion := "2.11.12",
    mainClass in Compile := Some("StreamingExample")
  )

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-streaming" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "com.couchbase.client" %% "spark-connector" % "2.2.0"
)

// META-INF discarding
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
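The assemblyMergeStrategy setting only takes effect if the sbt-assembly plugin is enabled and the jar is built with sbt assembly; the jar produced by a plain sbt package (which is what target/scala-2.11/spark-samples_2.11-1.0.jar looks like) contains only the project's own classes and none of the Couchbase connector classes. A minimal project/plugins.sbt sketch, where the plugin version is an assumption and should be matched to the sbt release in use, would be:

// project/plugins.sbt -- enables sbt-assembly; the version here is an assumption.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")

With that in place, sbt assembly produces a fat jar (by default named something like target/scala-2.11/spark-samples-assembly-1.0.jar), and that assembly jar, not the package jar, is the one to hand to spark-submit. Marking the three Spark dependencies as "provided" also keeps the assembly small, since spark-submit supplies those jars itself.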
The Spark version installed on my machine is 2.4.0, built with Scala 2.11.12.
Observations:
I do not see com.couchbase.client_spark-connector_2.11-2.2.0 in the Spark jars directory (/usr/local/Cellar/apache-spark/2.4.0/libexec/jars), but an older com.couchbase.client_spark-connector_2.10-1.2.0.jar is present there.
Why does spark-submit not work?
How is sbt able to run it, and where does it download the dependencies to? (A sketch of the difference follows below.)
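On those two questions: sbt run resolves the dependencies declared in build.sbt into its local cache (typically ~/.ivy2/cache, or the Coursier cache on newer sbt releases) and puts all of them on the classpath of the JVM it launches, which is why the Scala 2.11 / 2.2.0 connector is found there. spark-submit, by contrast, only puts Spark's own jars directory plus the submitted jar on the classpath, so the connector classes have to be either packaged into an assembly jar or supplied at submit time. As a sketch, using the Maven coordinates implied by the build.sbt above, the latter could look like:

spark-submit --class StreamingExample --master "local[*]" --packages com.couchbase.client:spark-connector_2.11:2.2.0 target/scala-2.11/spark-samples_2.11-1.0.jar

--packages is a standard spark-submit option that resolves the listed coordinates (and their transitive dependencies) at launch and adds them to the driver and executor classpaths. Note that the older connector observed inside the Spark jars directory would still be on the classpath and may need to be removed to avoid the same shadowing.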
1 Answer
Make sure that the Scala version used by sbt and the version of the Spark connector library match the ones used by your Spark installation.
I ran into a similar problem while trying to run a sample Flink job on my system, and it turned out to be caused by a version mismatch.
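For this particular setup the check is roughly: spark-submit --version reports which Scala version the installation was built with, and that has to match both the scalaVersion in build.sbt and the _2.11 suffix of the connector artifact; the connector that ends up on the runtime classpath also has to be the same 2.2.0 build the code was compiled against, not the older 1.2.0 jar sitting in the Spark jars directory. A sketch of the aligned dependency section follows; the versions are the ones already given in the question, and marking the Spark artifacts as "provided" is an assumption about packaging the job with sbt-assembly:

// Sketch only: keep scalaVersion and the Spark artifacts in lockstep with the
// installed Spark (2.4.0 built for Scala 2.11 here), and let the assembled jar
// carry just the Couchbase connector.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark"     %% "spark-core"      % "2.4.0" % "provided",
  "org.apache.spark"     %% "spark-streaming" % "2.4.0" % "provided",
  "org.apache.spark"     %% "spark-sql"       % "2.4.0" % "provided",
  "com.couchbase.client" %% "spark-connector" % "2.2.0"
)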