Cannot save to Cassandra from Spark

kg7wmglp · asked 2022-11-05 · in Cassandra
Follow (0) | Answers (2) | Views (221)

I want to read files from HDFS and save them to Cassandra:

import org.apache.spark.{SparkConf, SparkContext}
import com.datastax.spark.connector._
val conf = new SparkConf().setMaster("local[2]").setAppName("test")
.set("spark.cassandra.connection.host", "192.168.0.1")
val sc = new SparkContext(conf)
val files = sc.textFile("hdfs://192.168.0.1:9000/test/", 1)
files.map(_.split("\n")).saveToCassandra("ks", "tb", SomeColumns("id", "time", "text"))
sc.stop()

But I can't write to Cassandra because of an exception. Reading the
files themselves works, since files.foreach(x => println(x)) prints them correctly.
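The likely cause: sc.textFile already yields one line per RDD element, so `_.split("\n")` produces single-element arrays that cannot be mapped to the three columns in SomeColumns. A minimal sketch of the intended per-line transformation on a local list, assuming "|"-separated fields (the delimiter is an assumption; adjust it to the real file format):

```scala
// Hypothetical sample lines standing in for the RDD's contents;
// the "id|time|text" layout is an assumption for illustration
val lines = List("1|2022-11-05T10:00|hello", "2|2022-11-05T10:01|world")

// Split on the field delimiter, not on "\n": each element is already
// one line, so splitting on "\n" gives a single-element array
val rows = lines.map(_.split("\\|")).map(r => (r(0), r(1), r(2)))

println(rows)
```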


gcxthw6b1#

As far as I understand, you can make the following changes:

import org.apache.spark.sql.Row
import com.datastax.spark.connector.writer.SqlRowWriter

val wc1 = files.map(_.split("\\|")).map(r => Row(r(0), r(1), r(2)))
implicit val rowWriter = SqlRowWriter.Factory
wc1.saveToCassandra("ks", "tb", SomeColumns("id", "time", "text"))

"希望能成功“


fwzugrvs2#

I write the file this way (fragment):

    str += "\n" + x.nextInt().abs.toString() + " " + java.time.LocalDateTime.now() + " " + x.alphanumeric.take(20).mkString
  }
  else str += x.nextInt().abs.toString() + " " + java.time.LocalDateTime.now() + " " + x.alphanumeric.take(20).mkString + "\n"
}
str
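Note that the generator above separates the three fields with single spaces, so whatever delimiter the split in the save code uses has to match. A sketch of parsing one such line, assuming the "<id> <timestamp> <20 alphanumeric chars>" format produced above (the sample values are hypothetical):

```scala
// A line in the format produced by the generator above
val line = "12345 2022-11-05T10:00:00 abcdefghij0123456789"

// Split on single spaces; timestamps from java.time.LocalDateTime.now()
// contain no spaces, so this yields exactly three fields
val fields = line.split(" ")
val row = (fields(0), fields(1), fields(2))
```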
