Oozie Hive action with a Kerberos-secured Hive metastore server

kadbb459 · posted 2021-06-02 in Hadoop

I am using CDH5. I have set up the Hive metastore to use Kerberos, i.e. hive-site.xml has the following properties:

<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/hive/conf/hive.keytab</value>
</property>

<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hive/hive-metastore.example.com@example.COM</value>
</property>

The logs show no errors when the Hive metastore service starts.
I am trying to run a Hive action in an Oozie workflow. oozie-site.xml has the following property:

<property>
  <name>oozie.credentials.credentialclasses</name>
  <value>hcat=org.apache.oozie.action.hadoop.HCatCredentials</value>
</property>

and the workflow XML has a credentials section:

<credentials>
    <credential name='hive_credentials' type='hcat'>
          <property>
              <name>hcat.metastore.uri</name>
              <value>thrift://hive-metastore.example.com:9083</value>
          </property>
          <property>
              <name>hcat.metastore.principal</name>
              <value>hive/hadoop-metastore.example.com@example.COM</value>
          </property>
     </credential>
</credentials>

The Hive action references this credential via the cred attribute:

<action name="hive" cred="hive_credentials">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <job-xml>${appPath}/hive-site.xml</job-xml>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
        </configuration>
        <script>${appPath}/queries.hql</script>
    </hive>
    <ok to="pass"/>
    <error to="fail"/>
</action>

When I try to run this workflow, I get the following error:

Exception in addtoJobConf
MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: No common protection layer between client and server
        at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
        at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
        at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
        at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:288)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
        at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialTokens(JavaActionExecutor.java:990)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:851)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1071)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:217)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:62)
        at org.apache.oozie.command.XCommand.call(XCommand.java:280)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
        at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        at java.lang.Thread.run(Thread.java:662)
)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:334)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
        at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:109)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.getHCatClient(HCatCredentialHelper.java:87)
        at org.apache.oozie.action.hadoop.HCatCredentialHelper.set(HCatCredentialHelper.java:52)
        at org.apache.oozie.action.hadoop.HCatCredentials.addtoJobConf(HCatCredentials.java:58)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.setCredentialTokens(JavaActionExecutor.java:990)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.submitLauncher(JavaActionExecutor.java:851)
        at org.apache.oozie.action.hadoop.JavaActionExecutor.start(JavaActionExecutor.java:1071)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:217)
        at org.apache.oozie.command.wf.ActionStartXCommand.execute(ActionStartXCommand.java:62)
        at org.apache.oozie.command.XCommand.call(XCommand.java:280)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:323)
        at org.apache.oozie.service.CallableQueueService$CompositeCallable.call(CallableQueueService.java:252)
        at org.apache.oozie.service.CallableQueueService$CallableWrapper.run(CallableQueueService.java:174)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
        at java.lang.Thread.run(Thread.java:662)

Any idea what is causing this?

Answer 1 (by 2ledvvac):

My workflow uses:

hive/_HOST@example.COM

as the hcat.metastore.principal, instead of:

hive/hadoop-metastore.example.com@example.COM

Could you try that?
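
For reference, here is a sketch of the question's credentials block rewritten to use the _HOST placeholder, which the Hive client typically resolves to the host taken from the metastore URI at connect time; the URI and realm are carried over unchanged from the question.

<credentials>
    <credential name='hive_credentials' type='hcat'>
          <property>
              <name>hcat.metastore.uri</name>
              <value>thrift://hive-metastore.example.com:9083</value>
          </property>
          <property>
              <!-- _HOST is a placeholder; it is expected to be substituted with the metastore hostname -->
              <name>hcat.metastore.principal</name>
              <value>hive/_HOST@example.COM</value>
          </property>
     </credential>
</credentials>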
