SparkR ObjectStore: Failed to get database global_temp, returning NoSuchObjectException

ukdjmx9f  asked on 2021-06-26  in Hive

When trying to connect to a Spark cluster from RStudio using SparkR:

if (nchar(Sys.getenv("SPARK_HOME")) < 1) {
  Sys.setenv(SPARK_HOME = "/usr/lib/spark/spark-2.1.1-bin-hadoop2.6")
  .libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
}

library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))

# Starting a sparkR session

sparkR.session(master = "spark://myIpAddress.eu-west-1.compute.internal:7077")

I get the following output, including an error message:

Spark package found in SPARK_HOME: /usr/lib/spark/spark-2.1.1-bin-hadoop2.6
Launching java with spark-submit command /usr/lib/spark/spark-2.1.1-bin-hadoop2.6/bin/spark-submit   sparkr-shell /tmp/RtmpMWFrt6/backend_port71e6731ea922 
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/05/24 16:17:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/05/24 16:17:37 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
Java ref type org.apache.spark.sql.SparkSession id 1

In the Spark master UI I can see the SparkR application running, but no sc variable is available. My feeling is that this error is related to the metastore, but I'm not sure. Does anyone know what is preventing my Spark session from starting correctly?
Thanks, Michael

snz8szmq1#

1. Delete the stale linked file with sudo rm /etc/spark/conf/hive-site.xml
2. Link the file again with sudo ln -s /etc/hive/conf/hive-site.xml /etc/spark/conf/hive-site.xml
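The two steps above can be sketched end-to-end in a scratch directory. The paths under /tmp are stand-ins for the real /etc/hive/conf and /etc/spark/conf, and the minimal hive-site.xml content is an assumption; this only illustrates the relink, it is not the cluster fix itself:

```shell
# Recreate the layout in a scratch directory so the sketch is safe to run.
# /tmp/hive-demo/... are stand-ins for the real /etc paths named in the answer.
mkdir -p /tmp/hive-demo/hive/conf /tmp/hive-demo/spark/conf

# Stand-in for the hive-site.xml that ships with the Hive installation.
echo '<configuration/>' > /tmp/hive-demo/hive/conf/hive-site.xml

# Step 1: remove any stale copy or broken link on the Spark side.
rm -f /tmp/hive-demo/spark/conf/hive-site.xml

# Step 2: symlink Spark's conf dir to Hive's hive-site.xml so both
# read the same metastore configuration.
ln -s /tmp/hive-demo/hive/conf/hive-site.xml /tmp/hive-demo/spark/conf/hive-site.xml

# The link should now resolve to Hive's copy.
readlink /tmp/hive-demo/spark/conf/hive-site.xml
```

On a real node you would run the same two commands with sudo against /etc/spark/conf and /etc/hive/conf, then restart the SparkR session so it picks up the metastore settings.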
