Spark JDBC write to Salesforce error

fv2wmkja  posted on 2021-06-26 in Hive

I am trying to read data from Hive and write it to a custom object in Salesforce using Progress's DataDirect JDBC driver for Salesforce. Here is how I do it:

spark-shell --jars /usr/hdp/current/spark-client/lib/sforce.jar

import org.apache.spark.sql.hive._
import org.apache.spark.sql.SaveMode

val hc = new HiveContext(sc)
val results = hc.sql("select rep_name FROM schema.rpt_view")
print(results.first())

val url = "jdbc:datadirect:sforce://login.salesforce.com"
val prop = new java.util.Properties
prop.put("user", "user1")
prop.put("password", "passwd")
prop.put("driver", "com.ddtek.jdbc.sforce.SForceDriver")
results.write.mode(SaveMode.Append).jdbc(url, "SFORCE.test_tab1", prop)

I get the error:

java.sql.SQLSyntaxErrorException: [DataDirect][SForce JDBC Driver][SForce]column size is required in statement [CREATE TABLE SFORCE.test_tab1 (rep_name TEXT

Can someone help me out? How do I configure the write when the table test_tab1 already exists in Salesforce, and how do I add the column values when it does not exist yet?
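One possible direction (a sketch, not a confirmed fix): the error comes from the CREATE TABLE statement Spark generates when the target table is missing. Since Spark 2.2, the JDBC writer supports a `createTableColumnTypes` option that overrides the column DDL Spark emits, which could supply the column size the DataDirect driver is asking for. This reuses `results`, `url`, and `prop` from the question above; `VARCHAR(255)` is an assumed size, and whether the driver accepts this DDL is untested here.

```scala
// Sketch, assuming Spark 2.2+ and the session from the question.
// - SaveMode.Append inserts into SFORCE.test_tab1 if it already exists.
// - createTableColumnTypes only applies when Spark has to create the table,
//   replacing the default "rep_name TEXT" with DDL that carries a size.
import org.apache.spark.sql.SaveMode

results.write
  .mode(SaveMode.Append)
  .option("createTableColumnTypes", "rep_name VARCHAR(255)") // assumed size
  .jdbc(url, "SFORCE.test_tab1", prop)
```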

No answers yet.
