java.lang.NoClassDefFoundError: org/apache/hbase/thirdparty/com/google/common/cache/CacheLoader

q3qa4bjr · asked 2021-06-09 · in HBase

I am using PySpark to read a Hive external table that is backed by HBase. The table was created successfully, but when I query it from PySpark I get the following error:

spark.sql("use mydatabase")
user_rdd_list = spark.sql("select user_id, user_profile from ex_tbl limit 1")
``` `Py4JJavaError: An error occurred while calling o124.showString. :java.lang.NoClassDefFoundError:org/apache/hbase/thirdparty/com/google/common/cache/CacheLoader` 所有配置为:

```
[('spark.master', 'local'),
 ('spark.app.id', 'local-1571631446655'),
 ('spark.executor.memory', '2g'),
 ('spark.executor.id', 'driver'),
 ('spark.executor.cores', '2'),
 ('spark.app.name', 'RealTimeRecommendation'),
 ('spark.driver.host', 'iZ2ze85uv4ktko46vm8juvZ'),
 ('spark.sql.warehouse.dir', '/user/hive/warehouse'),
 ('spark.sql.catalogImplementation', 'hive'),
 ('spark.rdd.compress', 'True'),
 ('spark.executor.instances', '2'),
 ('spark.serializer.objectStreamReset', '100'),
 ('spark.submit.deployMode', 'client'),
 ('spark.driver.port', '33103'),
 ('spark.ui.showConsoleProgress', 'true')]
```
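For context, a listing like the one above can be produced from the running session itself. A minimal sketch, assuming `spark` is the same SparkSession used in the query snippet:

```
# Sketch: dump the effective Spark configuration of the active session.
for key, value in sorted(spark.sparkContext.getConf().getAll()):
    print(key, "=", value)
```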

I have added the following jars to my SPARK_HOME/jars (an alternative way of supplying them is sketched after this list):

hbase-protocol-2.0.5.jar
hbase-client-2.0.5.jar
hbase-common-2.0.5.jar
hbase-server-2.0.5.jar
hive-hbase-handler-2.3.5.jar
metrics-core-3.1.5.jar
metrics-core-3.2.1.jar
guava-11.0.2.jar
guava-14.0.1.jar
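Instead of copying jars into SPARK_HOME/jars, they can also be handed to the session explicitly when it is built. A minimal sketch, assuming the jars sit under a hypothetical /opt/hbase-jars directory (the directory and the builder options are illustrative, not part of the original setup):

```
from pyspark.sql import SparkSession

# Illustrative only: comma-separated list of the jars named above,
# under an assumed /opt/hbase-jars directory.
hbase_jars = ",".join([
    "/opt/hbase-jars/hbase-client-2.0.5.jar",
    "/opt/hbase-jars/hbase-common-2.0.5.jar",
    "/opt/hbase-jars/hbase-server-2.0.5.jar",
    "/opt/hbase-jars/hbase-protocol-2.0.5.jar",
    "/opt/hbase-jars/hive-hbase-handler-2.3.5.jar",
])

spark = (
    SparkSession.builder
    .appName("RealTimeRecommendation")
    .config("spark.jars", hbase_jars)   # ship these jars to the driver and executors
    .enableHiveSupport()                # needed for spark.sql("use mydatabase")
    .getOrCreate()
)
```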

My Spark version is 2.4.3. How can I fix this?
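One quick check that may help narrow this down is asking the driver JVM directly whether it can see the class from the stack trace. A minimal diagnostic sketch; `spark._jvm` is an internal PySpark/py4j handle, so this is a convenience hack rather than a supported API:

```
# Probe the driver JVM's classpath for the class named in the error.
# `spark` is the active SparkSession; `spark._jvm` is py4j's gateway to the driver JVM.
try:
    spark._jvm.java.lang.Class.forName(
        "org.apache.hbase.thirdparty.com.google.common.cache.CacheLoader"
    )
    print("class is visible to the driver JVM")
except Exception as e:
    print("class NOT found on the driver classpath:", e)
```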
