spark-submit throws an error when using Hive tables

kcrjzv8t | posted 2021-06-29 in Hive

I'm hitting a strange error. I'm trying to write data to Hive: it works fine in spark-shell, but when I use spark-submit it throws a database/table not found in default error.
Below is the code I'm trying to run through spark-submit; I'm using a custom build of Spark 2.0.0.

// Create a plain SQLContext from the existing SparkContext and look up the table
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
sqlContext.table("spark_schema.iris_ori")

Below is the command I'm using:

/home/ec2-user/Spark_Source_Code/spark/bin/spark-submit --class TreeClassifiersModels --master local[*] /home/ec2-user/Spark_Snapshots/Spark_2.6/TreeClassifiersModels/target/scala-2.11/treeclassifiersmodels_2.11-1.0.3.jar /user/ec2-user/Input_Files/defPath/iris_spark SPECIES~LBL+PETAL_LENGTH+PETAL_WIDTH RAN_FOREST 0.7 123 12

And here is the error:
16/05/20 09:05:18 INFO SparkSqlParser: Parsing command: spark_schema.measures_20160520090502
Exception in thread "main" org.apache.spark.sql.AnalysisException: Database 'spark_schema' does not exist;
	at org.apache.spark.sql.catalyst.catalog.ExternalCatalog.requireDbExists(ExternalCatalog.scala:37)
	at org.apache.spark.sql.catalyst.catalog.InMemoryCatalog.tableExists(InMemoryCatalog.scala:195)
	at org.apache.spark.sql.catalyst.catalog.SessionCatalog.tableExists(SessionCatalog.scala:360)
	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:464)
	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:458)
	at TreeClassifiersModels$.main(TreeClassifiersModels.scala:71)
	at TreeClassifiersModels.main(TreeClassifiersModels.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:726)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:183)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:208)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:122)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


cczfrluj #1

The problem is due to a deprecation in Spark 2.0.0: HiveContext is deprecated as of Spark 2.0.0. To read/write Hive tables on Spark 2.0.0, we need to use a SparkSession, like this:

val sparkSession = SparkSession.withHiveSupport(sc)
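
Note that depending on your Spark 2.0.0 build, SparkSession.withHiveSupport may not be available (it was removed from the public API around the final 2.0.0 release). The standard way to get a Hive-aware session on Spark 2.0.0 is the builder API. A minimal sketch, with the app name as a placeholder:

import org.apache.spark.sql.SparkSession

// Build a session backed by the Hive metastore; requires the spark-hive
// module on the classpath and hive-site.xml visible to the driver
val spark = SparkSession.builder()
  .appName("TreeClassifiersModels")  // placeholder app name
  .enableHiveSupport()
  .getOrCreate()

// Hive databases/tables are now resolvable
val iris = spark.table("spark_schema.iris_ori")

This also explains the InMemoryCatalog frames in the stack trace above: without Hive support, the application falls back to Spark's in-memory catalog, which contains no spark_schema database.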
