How do I run a shell script with a Hive action in Oozie?

Asked by pod7payv on 2021-05-29 in Hadoop

I am trying to run a shell script with a Hive action in Oozie on a daily schedule. The Oozie action itself runs successfully, but the Hive part of the shell script does not work. When I run the script directly from a shell, it runs fine. The files are located in HDFS. This is the exception:

Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1422)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2457)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2469)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:341)
    ... 7 more
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1420)
    ... 12 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: GSS initiate failed
    at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)

Here is my script:

# Run the Hive query silently, passing shell variables through as
# -hiveconf variables, and capture its stdout.
S=$(hive -S -hiveconf MY_VAR1=$DB -hiveconf MY_VAR2=$avgpay -hiveconf MY_VAR3=$Date_LastDay -hiveconf MY_VAR4=$Date_LastNmonth -f hv.hql)

`mysql ...`

# Split the captured output on whitespace into an array and walk it
# five fields at a time, issuing a mysql command per record.
S1=( $( for k in $S ; do echo $k ; done ) )
cntn=${#S1[@]}
for (( p=0 ; p<$cntn; p=p+5 ))
do
    `mysql ...`
done
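hv.hql itself is not shown in the question. For context, values passed on the command line with -hiveconf NAME=value are read inside the script as ${hiveconf:NAME}. A minimal, hypothetical sketch of such a script (the database, table, and column names are invented for illustration):

-- Hypothetical hv.hql: only the ${hiveconf:...} references are the point.
USE ${hiveconf:MY_VAR1};

SELECT emp_id, pay, pay_date
FROM payments
WHERE pay_date BETWEEN '${hiveconf:MY_VAR4}' AND '${hiveconf:MY_VAR3}'
  AND pay > ${hiveconf:MY_VAR2};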

Here is the workflow:

<workflow-app name="shell-wf" xmlns="uri:oozie:workflow:0.4">
  <start to="shellbpxp"/>
  <action name="shellbpxp">
    <shell xmlns="uri:oozie:shell-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <configuration>
        <property>
          <name>mapred.job.queue.name</name>
          <value>${queueName}</value>
        </property>
      </configuration>
      <exec>netcool.sh</exec>
      <file>netcool.sh#netcool.sh</file>
      <file>hv.hql#hv.hql</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Script failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name='end'/>
</workflow-app>

guykilcj (answer 1):

You need to pass the hive-config.xml file as a <file> parameter in your workflow.xml, like this:

<workflow-app name="shell-wf" xmlns="uri:oozie:workflow:0.4">
  <start to="shellbpxp"/>
  <action name="shellbpxp">
    <shell xmlns="uri:oozie:shell-action:0.1">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <job-xml>/user/<<your_path>>/hive-config.xml</job-xml>
      <configuration>
        <property>
          <name>mapred.job.queue.name</name>
          <value>${queueName}</value>
        </property>
      </configuration>
      <exec>netcool.sh</exec>
      <file>netcool.sh#netcool.sh</file>
      <file>hv.hql#hv.hql</file>
      <file>/user/<<your_path>>/hive-config.xml#hive-config.xml</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Script failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name='end'/>
</workflow-app>
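The name#symlink form of <file> tells Oozie to ship the HDFS file with the launcher job and expose it under the given symlink name in the shell action's working directory, which is why netcool.sh can refer to hv.hql and hive-config.xml by their bare names.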

You can find the hive-config.xml file under the /etc/hive/conf directory. You need to comment out the fs.defaultFS property in hive-config.xml.
hive-config.xml contains the metastore URIs needed to connect to the Hive metastore.
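A rough sketch of the relevant part of hive-config.xml, with fs.defaultFS commented out as described (the host names and ports below are placeholders, not values from the question):

<configuration>
  <!-- Metastore URI the Hive client connects to; replace host and port
       with your environment's values. -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host.example.com:9083</value>
  </property>

  <!-- Comment out fs.defaultFS so it does not override the NameNode
       the Oozie action is configured with:
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host.example.com:8020</value>
  </property>
  -->
</configuration>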
