How do I make a region/DC-aware connection to Cassandra from Spark?

piok6c0g · published 2021-06-14 in Cassandra

I am using Spark SQL 2.4.1, spark-cassandra-connector_2.11-2.4.1.jar, and Java 8. When I try to fetch data from a table, I get:

java.io.IOException: Failed to write statements to keyspace1.model_vals. The
latest exception was
  An unexpected error occurred server side on cassandra-node1: com.google.common.util.concurrent.UncheckedExecutionException: com.google.common.util.concurrent.UncheckedExecutionException: java.lang.RuntimeException: org.apache.cassandra.exceptions.ReadTimeoutException: Operation timed out - received only 0 responses.

So, how can I make my Spark code connect to the Cassandra DB in a region/DC-aware way?

Existing YAML:

spring:
  data:
    cassandra:
      keyspace-name: raproduct
      contact-points:
        - cassandra-node1
        - cassandra-node2
      port: 9042

Changed to:

spring:
  data:
    cassandra:
      connection:
        local_dc: southeast-1
      keyspace-name: raproduct
      contact-points:
        - cassandra-node1
        - cassandra-node2
      port: 9042
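(Editor's note: `connection.local_dc` is not a property name that Spring Boot binds, which would explain why the change has no effect. In Spring Boot 2.3+, which uses the DataStax Java driver 4.x, the supported key is `spring.data.cassandra.local-datacenter`. A sketch, assuming the datacenter really is named `southeast-1`:)

```yaml
spring:
  data:
    cassandra:
      keyspace-name: raproduct
      local-datacenter: southeast-1   # supported key in Spring Boot 2.3+
      contact-points:
        - cassandra-node1
        - cassandra-node2
      port: 9042
```

On older Spring Boot versions (driver 3.x), the datacenter is instead chosen through the driver's load-balancing policy (e.g. `DCAwareRoundRobinPolicy`) configured on the `Cluster` builder, not through application properties.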

Question

But the changed `local_dc` is not reflected/applied. How can this be done in Spring Data?

g6ll5ycj1#

Check the Spark Cassandra Connector documentation and its configuration reference (Cassandra connection parameters). It appears this can be done by setting the `spark.cassandra.connection.local_dc` property in the connection configuration:

val conf = new SparkConf(true)
  .set("spark.cassandra.connection.host", "192.168.1.10")
  .set("spark.cassandra.auth.username", "flynn")
  .set("spark.cassandra.auth.password", "reindeerFlotilla82")
  // Pin the connector to a specific Cassandra datacenter
  .set("spark.cassandra.connection.local_dc", "encom_west1_dc")

val sc = new SparkContext("spark://192.168.1.133:7077", "test", conf)
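For the Spark SQL 2.4 API used in the question, the same property can be set when building the `SparkSession` and the table then read through the connector's DataSource. A minimal sketch using the question's keyspace/table and hosts, with `southeast-1` assumed as the DC name; it needs a running Spark + Cassandra setup, so it is illustrative only:

```scala
import org.apache.spark.sql.SparkSession

// Hosts and DC name are taken from the question; adjust to your cluster.
val spark = SparkSession.builder()
  .appName("dc-aware-read")
  .config("spark.cassandra.connection.host", "cassandra-node1,cassandra-node2")
  .config("spark.cassandra.connection.local_dc", "southeast-1")
  .getOrCreate()

// Read keyspace1.model_vals via the Cassandra DataSource
val df = spark.read
  .format("org.apache.spark.sql.cassandra")
  .option("keyspace", "keyspace1")
  .option("table", "model_vals")
  .load()
```

Note that `local_dc` must match the datacenter name as reported by `nodetool status` on the Cassandra side, which is often not the cloud region name.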

I'm not sure what your connection configuration code looks like, but try setting the `spark.cassandra.connection.local_dc` property and see what you get.
