Accessing OpenStack Swift from Spark - SwiftAuthenticationFailedException

Asked by uxh89sit on 2021-05-29 in Hadoop

I am trying to access OpenStack Swift from Spark 2.4, but I get the following error:

org.apache.hadoop.fs.swift.exceptions.SwiftAuthenticationFailedException: Authenticate as tenant '78axxxxxxxxxxxxxxxxxxxxxxxxxxxx' PasswordCredentials{username='xxxxxxxxxxxx'}

sc.hadoopConfiguration.set("fs.swift.service.ovh.auth.url", "https://auth.cloud.ovh.net/v3/")
sc.hadoopConfiguration.set("fs.swift.service.ovh.tenant", "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx")
sc.hadoopConfiguration.set("fs.swift.service.ovh.username", "xxxxxxxxxxxx")
sc.hadoopConfiguration.set("fs.swift.service.ovh.password", "xxxxxxxxxxxxxxxxxxxx")
sc.hadoopConfiguration.set("fs.swift.service.ovh.http.port", "8080")
sc.hadoopConfiguration.set("fs.swift.service.ovh.region", "BHS3")
sc.hadoopConfiguration.set("fs.swift.service.ovh.public", "false")

I believe the credentials are correct, since they come straight from the OpenStack RC file and work fine with python-swiftclient. I have also tried the v2.0 endpoint, without success.
Unfortunately, I always get this very generic error message, which does not tell me which part failed. Is there a better way to debug this?
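One way to get more detail than the generic exception is to raise the log level for the Hadoop Swift client classes. A minimal sketch, assuming the stock `conf/log4j.properties` that ships with the Spark distribution (file name and logger granularity may differ in your setup):

```properties
# Hypothetical addition to conf/log4j.properties:
# log the Swift filesystem client's auth requests/responses at DEBUG
log4j.logger.org.apache.hadoop.fs.swift=DEBUG
```

With this in place, the driver log should show the actual Keystone authentication request and the HTTP status it received, which usually narrows down whether the tenant, endpoint version, or credentials are at fault.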


Answer #1, by bxfogqkk

I used the example below, which I got from the OVH Spark Submit team. One important note: use the tenant name (or project name), not the tenant id from the openstack.rc file.

val hadoopConf = spark.sparkContext.hadoopConfiguration

hadoopConf.set("fs.swift.impl","org.apache.hadoop.fs.swift.snative.SwiftNativeFileSystem")
hadoopConf.set("fs.swift.service.auth.endpoint.prefix","/AUTH_")
hadoopConf.set("fs.swift.service.abc.http.port","443")
hadoopConf.set("fs.swift.service.abc.auth.url","https://auth.cloud.ovh.net/v2.0/tokens")
hadoopConf.set("fs.swift.service.abc.tenant","<TENANT NAME> or <PROJECT NAME>")
hadoopConf.set("fs.swift.service.abc.region","<REGION NAME>")
hadoopConf.set("fs.swift.service.abc.useApikey","false")
hadoopConf.set("fs.swift.service.abc.username","<USER NAME>")
hadoopConf.set("fs.swift.service.abc.password","<PASSWORD>")
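Once the `abc` service is configured as above, Swift objects are addressed with `swift://<container>.<service>/<path>` URLs. A short usage sketch; the container name `mybucket` and object path are hypothetical placeholders:

```scala
// Read a text object from the hypothetical container "mybucket"
// via the Swift service named "abc" configured above.
val lines = spark.sparkContext.textFile("swift://mybucket.abc/data/input.txt")
println(lines.count())
```

Note that the service name in the URL host (`abc`) must match the middle segment of the `fs.swift.service.abc.*` configuration keys.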

https://github.com/mojtabaimani/spark-wordcount-swift-scala/blob/master/src/main/scala/com/ovh/example/sparkscalaapp.scala
