Failed to start Thrift server, file not found: /tmp/hive/

elcex8rz · posted 2021-05-29 in Hadoop

I have access to a cluster that shows the following error when we try to start the Thrift server with this command:

dse -u my_user -p my_password spark-sql-thriftserver start

After running the command, the following message is displayed:

starting org.apache.spark.sql.hive.thriftserver.HiveThriftServer2, logging to /root/spark-thrift-server/spark-root-org.apache.spark.sql.hive.thriftserver.HiveThriftServer2-1-node3.domain.com.out

When I look at the contents of that file, the error is:
WARN  2017-12-29 01:44:01,875 org.apache.spark.SparkContext: Using an existing SparkContext; some configuration may not take effect.
ERROR 2017-12-29 01:44:05,379 org.apache.spark.deploy.DseSparkSubmitBootstrapper: Failed to start or submit Spark application
java.lang.RuntimeException: com.datastax.bdp.fs.model.NoSuchFileException: File not found: /tmp/hive/
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522) ~[hive-exec-1.2.1.spark2.jar:1.2.1.spark2]
    at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:189) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_151]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_151]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_151]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_151]
    at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:258) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:359) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:263) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.HiveSharedState.metadataHive$lzycompute(HiveSharedState.scala:39) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.HiveSharedState.metadataHive(HiveSharedState.scala:38) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.HiveSessionState.metadataHive$lzycompute(HiveSessionState.scala:43) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.HiveSessionState.metadataHive(HiveSessionState.scala:43) ~[spark-hive_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:62) ~[spark-hive-thriftserver_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
    at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:81) ~[spark-hive-thriftserver_2.11-2.0.2.6-de611f9.jar:2.0.2.6-de611f9]
I have already created that directory manually, but it did not help :(
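One detail worth noting: the exception type, com.datastax.bdp.fs.model.NoSuchFileException, comes from DSEFS (the DataStax Enterprise file system), which suggests Hive's scratch directory is being looked up in DSEFS rather than on the local Linux filesystem, so creating /tmp/hive locally would not be seen. A minimal sketch of creating it inside DSEFS instead, assuming the DSEFS shell (dse fs) and its mkdir/chmod/ls commands as described in the DataStax documentation are available on this cluster:

```shell
# Sketch only: create Hive's scratch directory inside DSEFS, not on the local disk.
# Assumes DSEFS is enabled and the dse fs shell accepts these commands.
dse -u my_user -p my_password fs "mkdir /tmp/hive"

# Hive typically expects its scratch directory to be broadly writable.
dse -u my_user -p my_password fs "chmod 777 /tmp/hive"

# Verify the directory now exists in DSEFS before retrying the Thrift server.
dse -u my_user -p my_password fs "ls /tmp"
```

The credentials and command forms above mirror the ones in the question; adjust them to your cluster.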

