Oozie Spark accessing Hive with Kerberos

mnemlml8, published 2021-06-26 in Hive

When I run my Spark job from Oozie, it fails with the following error: the database cannot be found.

2018-09-26 15:27:23,576 INFO [main] org.apache.spark.deploy.yarn.Client: 
     client token: Token { kind: YARN_CLIENT_TOKEN, service:  }
     diagnostics: User class threw exception: org.apache.spark.sql.catalyst.analysis.NoSuchDatabaseException: Database 'zdm_datos_externos' not found;
     ApplicationMaster host: 10.74.234.6
     ApplicationMaster RPC port: 0
     queue: default
     queue user: administrador
     start time: 1537986410475
     final status: FAILED
     tracking URL: https://BR-PC-CentOS-02:26001/proxy/application_1537467570666_4127/
     user: administrador



This is my Spark configuration:

    String warehouseLocation = new File("spark-warehouse").getAbsolutePath();
    SparkSession spark = SparkSession
            .builder()
            .appName("Java Spark Hive Example")
            .master("yarn")
            .config("spark.sql.warehouse.dir", warehouseLocation)
            .config("spark.driver.maxResultSize", "3g")
            .config("spark.debug.maxToStringFields", "10000")
            .config("spark.sql.crossJoin.enabled", "true")
            .enableHiveSupport()
            .getOrCreate();
    spark.conf().set("spark.driver.maxResultSize", "3g");
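
The builder above never tells the session where the Hive metastore actually is, so when the job is launched from Oozie (where a local hive-site.xml may not be on the classpath) Spark can fall back to a local, empty warehouse and report the database as missing. A minimal sketch of the extra settings that usually matter on a Kerberized cluster; the thrift URI and principal below are hypothetical placeholders, to be replaced with the values from your cluster's hive-site.xml:

```java
// Sketch only: "metastore-host:9083" and "hive/_HOST@EXAMPLE.COM" are
// placeholders, not real values from this cluster.
SparkSession spark = SparkSession
        .builder()
        .appName("Java Spark Hive Example")
        .master("yarn")
        // Point the session at the real (Kerberized) metastore instead of
        // relying on a local spark-warehouse directory.
        .config("hive.metastore.uris", "thrift://metastore-host:9083")
        .config("hive.metastore.sasl.enabled", "true")
        .config("hive.metastore.kerberos.principal", "hive/_HOST@EXAMPLE.COM")
        .enableHiveSupport()
        .getOrCreate();
```

Alternatively, shipping the cluster's hive-site.xml with the Oozie action (so it ends up on the driver's classpath) achieves the same thing without hard-coding these values.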

The Oozie launcher log shows the metastore connection failing (truncated at the start):

    …metastore, current connections: 1
    2018-09-26 17:31:42,598 WARN [main] hive.metastore: set_ugi() not successful, Likely cause: new client talking to old server. Continuing without it.
    org.apache.thrift.transport.TTransportException
        at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
        at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
        at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:380)
        at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:230)
        at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_set_ugi(ThriftHiveMetastore.java:3748)
        at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.set_ugi(ThriftHiveMetastore.java:3734)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:557)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:249)
        at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1533)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
        at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
        at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3157)
        at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3176)
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3409)
        at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:178)
        at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:170)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anonfun$obtainCredentials$1.apply$mcV$sp(HiveCredentialProvider.scala:91)
        at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anonfun$obtainCredentials$1.apply(HiveCredentialProvider.scala:90)
        at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anonfun$obtainCredentials$1.apply(HiveCredentialProvider.scala:90)
        at org.apache.spark.deploy.yarn.security.HiveCredentialProvider$$anon$1.run(HiveCredentialProvider.scala:124)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1778)
        at org.apache.spark.deploy.yarn.security.HiveCredentialProvider.doAsRealUser(HiveCredentialProvider.scala:123)
        at org.apache.spark.deploy.yarn.security.HiveCredentialProvider.obtainCredentials(HiveCredentialProvider.scala:90)
        at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:82)
        at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager$$anonfun$obtainCredentials$2.apply(ConfigurableCredentialManager.scala:80)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.collection.MapLike$DefaultValuesIterable.foreach(MapLike.scala:206)
        at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
        at scala.collection.AbstractTraversable.flatMap(Traversable.scala:104)
        at org.apache.spark.deploy.yarn.security.ConfigurableCredentialManager.obtainCredentials(ConfigurableCredentialManager.scala:80)
        at org.apache.spark.deploy.yarn.Client.prepareLocalResources(Client.scala:430)
        at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:915)
        at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:195)
        at org.apache.spark.deploy.yarn.Client.run(Client.scala:1205)
        at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1261)
        at org.apache.spark.deploy.yarn.Client.main(Client.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:761)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:190)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:215)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:129)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
        at org.apache.oozie.action.hadoop.SparkMain.runSpark(SparkMain.java:113)
        at org.apache.oozie.action.hadoop.SparkMain.run(SparkMain.java:104)
        at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
        at org.apache.oozie.action.hadoop.SparkMain.main(SparkMain.java:38)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:238)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:459)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:187)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)


Answer 1 (bkkx9g8r):

Which version of Spark are you using? Did you enable Hive support on your SparkSession?

sparkBuilder.enableHiveSupport().appName(appName).getOrCreate()
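
Beyond enableHiveSupport(), the set_ugi() failure in the log suggests the launcher could not authenticate to the metastore. On a Kerberized cluster, an Oozie Spark action normally has to request a Hive metastore delegation token through an `hcat`-type credential in the workflow. A sketch of what that looks like; the workflow name, action name, metastore URI, and principal are placeholders you would adapt to your cluster:

```xml
<!-- Sketch only: names, URI, and principal below are placeholders. -->
<workflow-app name="spark-hive-wf" xmlns="uri:oozie:workflow:0.5">
  <credentials>
    <!-- "hcat" credentials make Oozie fetch a metastore delegation token
         before launching the action, so the Spark driver can talk to the
         Kerberized metastore. -->
    <credential name="hive_creds" type="hcat">
      <property>
        <name>hcat.metastore.uri</name>
        <value>thrift://metastore-host:9083</value>
      </property>
      <property>
        <name>hcat.metastore.principal</name>
        <value>hive/_HOST@EXAMPLE.COM</value>
      </property>
    </credential>
  </credentials>
  <start to="spark-node"/>
  <!-- The action must reference the credential via cred="..." -->
  <action name="spark-node" cred="hive_creds">
    <spark xmlns="uri:oozie:spark-action:0.1">
      <!-- job-tracker, name-node, jar, spark-opts, etc. go here -->
    </spark>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail"><message>Spark action failed</message></kill>
  <end name="end"/>
</workflow-app>
```

Without the credential, the metastore client falls back to the unauthenticated set_ugi() handshake, which is consistent with the TTransportException shown in the question's log.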
