Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog'

63lcw9qa  posted 2021-06-25  in Hive

I can't run Hive queries from PySpark.
I tried copying hive-site.xml into Spark's conf directory, but it still throws the same error.
Full error:
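For reference, the copy step described above is a single file copy; a minimal sketch, assuming Hive is installed under /usr/local/hive (a guessed path — only the Spark path /usr/local/spark-2.4.0 appears in the traceback below):

```shell
# Copy Hive's site config into Spark's conf directory so Spark can locate
# the Hive metastore. /usr/local/hive/conf is an assumed Hive install path.
cp /usr/local/hive/conf/hive-site.xml /usr/local/spark-2.4.0/conf/
```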

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/spark-2.4.0/python/pyspark/sql/context.py", line 358, in sql
    return self.sparkSession.sql(sqlQuery)
  File "/usr/local/spark-2.4.0/python/pyspark/sql/session.py", line 767, in sql
    return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
  File "/usr/local/spark-2.4.0/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
  File "/usr/local/spark-2.4.0/python/pyspark/sql/utils.py", line 79, in deco
    raise IllegalArgumentException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.IllegalArgumentException: u"Error while instantiating 'org.apache.spark.sql.hive.HiveExternalCatalog':"

a11xaf1n1#

In my tests with Oozie, I had to add the Hive-related jars that Spark needs. Try adding the same jars to Spark's conf.
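A minimal sketch of what "adding the Hive jars to Spark's conf" could look like, assuming the install paths from the traceback (/usr/local/spark-2.4.0) and a hypothetical Hive install at /usr/local/hive; the exact jar set depends on your Hive version:

```shell
# Tell Spark to use the Hive external catalog (this is the built-in
# spark.sql.catalogImplementation setting):
echo "spark.sql.catalogImplementation hive" >> /usr/local/spark-2.4.0/conf/spark-defaults.conf

# Make the Hive metastore jars visible to Spark, e.g. by copying them into
# Spark's jars directory (which is on Spark's classpath by default):
cp /usr/local/hive/lib/hive-metastore-*.jar /usr/local/spark-2.4.0/jars/
```

Alternatively, the jars can be passed per job with `spark-submit --jars` instead of copying them into the Spark install.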
