Using the sample linked below to load data from Databricks into a SQL DB, I get the following error:

command-3227900948916301:23: error: value bulkCopyToSqlDB is not a member of org.apache.spark.sql.DataFrameWriter[org.apache.spark.sql.Row]
df.write.mode(SaveMode.Overwrite).bulkCopyToSqlDB(bulkCopyConfig)
https://github.com/azure/azure-sqldb-spark/blob/fa1cf19ed797648a20d9b7f474d7c2cd88829ada/samples/scripts/bulkcopysample.scala
My code is as follows:
val bulkCopyConfig = Config(Map(
"url" -> url,
"databaseName" -> databaseName,
"dbTable" -> "dbo.xxxx",
"user" -> user,
"password" -> password,
"connectTimeout" -> "120",
"bulkCopyBatchSize" -> "100000",
"bulkCopyTableLock" -> "true",
"bulkCopyTimeout" -> "0",
"truncate" -> "true"
// "queryTimeout" -> "5"
))
df.write.mode(SaveMode.Overwrite).bulkCopyToSqlDB(bulkCopyConfig)
Any thoughts on why I'm getting the error?
1 Answer
You need the correct imports in order for the DataFrame to be extended with the additional functions:
Here is what I got (it fails because I don't have an active SQL DB, but it does find the function):
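The answer's snippet isn't preserved in this dump, but the sample script linked in the question shows the imports in question: the connect._ wildcard import supplies the implicit class that adds bulkCopyToSqlDB to a DataFrame. A minimal sketch along those lines (the connection values and the source of df are placeholder assumptions, not values from the question):

import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._  // adds bulkCopyToSqlDB to DataFrame via an implicit class

// Placeholder connection values -- substitute your own.
val bulkCopyConfig = Config(Map(
  "url"               -> "mysqlserver.database.windows.net",
  "databaseName"      -> "MyDatabase",
  "dbTable"           -> "dbo.MyTable",
  "user"              -> "username",
  "password"          -> "**********",
  "connectTimeout"    -> "120",
  "bulkCopyBatchSize" -> "100000",
  "bulkCopyTableLock" -> "true",
  "bulkCopyTimeout"   -> "0"
))

// df is any existing DataFrame whose schema matches dbo.MyTable, e.g.:
// val df = spark.read.parquet("/mnt/data/mytable")

// Per the linked sample, bulkCopyToSqlDB is invoked on the DataFrame
// itself, not on df.write, so with connect._ in scope this resolves:
df.bulkCopyToSqlDB(bulkCopyConfig)

Note that the linked sample calls bulkCopyToSqlDB directly on the DataFrame rather than on df.write, and that the azure-sqldb-spark library (the com.microsoft.azure:azure-sqldb-spark Maven artifact) must be attached to the Databricks cluster for these imports to resolve at all.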