Spark Streaming dependency issue with sbt + Cassandra connector

djp7away posted on 2022-11-05 in Cassandra

Hi all,
I am trying to integrate Cassandra with Spark Streaming. Below is the sbt file:

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-sql" % "1.6.1",
"com.datastax.spark" %% "spark-cassandra-connector" % "1.6.2",
"com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0",
("org.apache.spark" %% "spark-streaming-kafka" % "1.6.0").
exclude("org.spark-project.spark", "unused")
)

I added the line below for the Cassandra integration (this is the line that produces the error):

val lines = KafkaUtils.createDirectStream[
String, String, StringDecoder, StringDecoder](
ssc, kafkaParams, topics)

// Getting errors once I add the line below to the program
lines.saveToCassandra("test", "test", SomeColumns("key", "value"))

lines.print()

After adding the above line, I see the following error in the IDE:

If I try to package this project from the command prompt, I see a similar error:

For reference, I am using the following versions:

Scala - 2.11
Kafka - kafka_2.11-0.8.2.1
Java - 8
Cassandra - datastax-community-64bit_2.2.8

Please help resolve this issue.


t3psigkw1#

As expected, this was a dependency issue; it was resolved by updating the sbt file as shown below:

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-streaming" % "2.0.0" % "provided",
"org.apache.spark" %% "spark-sql" % "2.0.0",
"com.datastax.spark" %% "spark-cassandra-connector" % "2.0.0-RC1",
"com.datastax.cassandra" % "cassandra-driver-core" % "3.0.0",
("org.apache.spark" %% "spark-streaming-kafka" % "1.6.0").
exclude("org.spark-project.spark", "unused")
)
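For reference, below is a minimal sketch of a streaming job that should compile against the updated dependencies. The keyspace/table test.test and the columns key/value come from the question; the Cassandra host, Kafka broker address, topic name, and batch interval are assumptions, so adjust them to your environment.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils
import kafka.serializer.StringDecoder
import com.datastax.spark.connector.SomeColumns
import com.datastax.spark.connector.streaming._ // brings saveToCassandra into scope for DStreams

object KafkaToCassandra {
  def main(args: Array[String]): Unit = {
    // spark.cassandra.connection.host must point at a Cassandra node (assumed localhost here)
    val conf = new SparkConf()
      .setAppName("KafkaToCassandra")
      .set("spark.cassandra.connection.host", "127.0.0.1")
    val ssc = new StreamingContext(conf, Seconds(5))

    val kafkaParams = Map("metadata.broker.list" -> "localhost:9092") // assumed broker address
    val topics = Set("test")                                          // assumed topic name

    val lines = KafkaUtils.createDirectStream[
      String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

    // each (key, value) pair is written to the test.test table
    lines.saveToCassandra("test", "test", SomeColumns("key", "value"))
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}

Note that saveToCassandra only becomes available on a DStream after importing com.datastax.spark.connector.streaming._, and the connector reads the target cluster from the spark.cassandra.connection.host setting.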
