HBase keeps falling back to SIMPLE authentication

xkftehaa · posted 2021-06-03 in Hadoop
Follow (0) | Answers (4) | Views (567)

I am trying to set up HBase to authenticate against a secured HDFS and ZooKeeper. The HMaster authenticates with ZooKeeper successfully, but it keeps using SIMPLE authentication with HDFS. I can't tell what is missing from my configuration.

  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://dev1.example.com:8020/hbase</value>
  </property>
  <property>
    <name>hbase.tmp.dir</name>
    <value>/mnt/hadoop/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>

  <!-- authentication -->
  <property>
     <name>hbase.security.authentication</name>
     <value>kerberos</value>
  </property>
  <property>
     <name>hbase.rpc.engine</name>
     <value>org.apache.hadoop.hbase.ipc.SecureRpcEngine</value>
  </property>
  <property>
    <name>hbase.regionserver.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hbase.regionserver.keytab.file</name>
    <value>/etc/security/keytab/hbase.keytab</value>
  </property>
  <property>
    <name>hbase.master.kerberos.principal</name>
    <value>hbase/_HOST@EXAMPLE.COM</value>
  </property>
  <property>
    <name>hbase.master.keytab.file</name>
    <value>/etc/security/keytab/hbase.keytab</value>
  </property>
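As a quick sanity check, the presence of the security properties above can be verified programmatically. A minimal sketch in Python (the inline sample XML below is a stand-in for the real hbase-site.xml, not taken from an actual cluster):

```python
import xml.etree.ElementTree as ET

# Stand-in for the real hbase-site.xml (assumption; values mirror the config above).
SAMPLE = """<configuration>
  <property><name>hbase.security.authentication</name><value>kerberos</value></property>
  <property><name>hbase.master.kerberos.principal</name><value>hbase/_HOST@EXAMPLE.COM</value></property>
  <property><name>hbase.master.keytab.file</name><value>/etc/security/keytab/hbase.keytab</value></property>
</configuration>"""

# Properties the configuration above sets for the master.
REQUIRED = [
    "hbase.security.authentication",
    "hbase.master.kerberos.principal",
    "hbase.master.keytab.file",
]

def read_props(xml_text):
    """Return {name: value} for every <property> in a Hadoop-style XML config."""
    root = ET.fromstring(xml_text)
    return {p.findtext("name"): p.findtext("value") for p in root.findall("property")}

props = read_props(SAMPLE)
missing = [k for k in REQUIRED if k not in props]
print(missing)                                  # []
print(props["hbase.security.authentication"])   # kerberos
```

The same `read_props` helper can be pointed at the files actually deployed on the master and region servers to rule out a typo in a property name.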

Here is the log from the HMaster:

2014-01-24 17:14:59,278 DEBUG [master:dev1:60000] ipc.Client: Connecting to dev1.example.com/192.168.11.101:8020
2014-01-24 17:14:59,457 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: starting, having connections 1
2014-01-24 17:14:59,465 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase sending #0
2014-01-24 17:14:59,491 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase got value #-1
2014-01-24 17:14:59,499 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: closing ipc connection to dev1.example.com/192.168.11.101:8020: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1042)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
2014-01-24 17:14:59,504 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: closed
2014-01-24 17:14:59,504 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: stopped, remaining connections 0
2014-01-24 17:14:59,514 DEBUG [master:dev1:60000] ipc.Client: The ping interval is 60000 ms.
2014-01-24 17:14:59,514 DEBUG [master:dev1:60000] ipc.Client: Connecting to dev1.example.com/192.168.11.101:8020
2014-01-24 17:14:59,531 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase sending #1
2014-01-24 17:14:59,531 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase: starting, having connections 1
2014-01-24 17:14:59,532 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase got value #-1
2014-01-24 17:14:59,532 DEBUG [IPC Client (1780703664) connection to dev1.example.com/192.168.11.101:8020 from hbase] ipc.Client: closing ipc connection to dev1.example.com/192.168.11.101:8020: SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1042)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
2014-01-24 17:14:59,536 FATAL [master:dev1:60000] master.HMaster: Unhandled exception. Starting shutdown.
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]
    at org.apache.hadoop.ipc.Client.call(Client.java:1347)
    at org.apache.hadoop.ipc.Client.call(Client.java:1300)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
    at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
    at com.sun.proxy.$Proxy12.setSafeMode(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.setSafeMode(ClientNamenodeProtocolTranslatorPB.java:561)
    at org.apache.hadoop.hdfs.DFSClient.setSafeMode(DFSClient.java:2146)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:983)
    at org.apache.hadoop.hdfs.DistributedFileSystem.setSafeMode(DistributedFileSystem.java:967)
    at org.apache.hadoop.hbase.util.FSUtils.isInSafeMode(FSUtils.java:432)
    at org.apache.hadoop.hbase.util.FSUtils.waitOnSafeMode(FSUtils.java:851)
    at org.apache.hadoop.hbase.master.MasterFileSystem.checkRootDir(MasterFileSystem.java:435)
    at org.apache.hadoop.hbase.master.MasterFileSystem.createInitialFileSystemLayout(MasterFileSystem.java:146)
    at org.apache.hadoop.hbase.master.MasterFileSystem.<init>(MasterFileSystem.java:127)
    at org.apache.hadoop.hbase.master.HMaster.finishInitialization(HMaster.java:789)
    at org.apache.hadoop.hbase.master.HMaster.run(HMaster.java:606)
    at java.lang.Thread.run(Thread.java:744)
2014-01-24 17:14:59,537 INFO  [master:dev1:60000] master.HMaster: Aborting
2014-01-24 17:14:59,537 DEBUG [master:dev1:60000] master.HMaster: Stopping service threads
2014-01-24 17:14:59,538 INFO  [master:dev1:60000] ipc.RpcServer: Stopping server on 60000

I have been searching for the cause, but still no luck.

ep6jt1vc #1

I was getting this error on the HBase master:

org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled.  Available:[TOKEN, KERBEROS]

As virya said above, the problem is that HBase is missing the Hadoop environment variables.
Editing hbase-env.sh and adding the Hadoop environment variables should correct the problem:

export HADOOP_HOME="/opt/hadoop"  # REPLACE WITH YOUR INSTALL DIRECTORY FOR HADOOP
export HADOOP_HDFS_HOME="${HADOOP_HOME}"
export HADOOP_MAPRED_HOME="${HADOOP_HOME}"
export HADOOP_YARN_HOME="${HADOOP_HOME}"
export HADOOP_CONF_DIR="${HADOOP_HOME}/etc/hadoop"

The cause of this error

HBase appears to read various settings from Hadoop's configuration files and install directory. In particular, it reads hdfs-site.xml to determine whether the HDFS cluster is running in secure mode. If it cannot find this file, it falls back to assuming HDFS is not secure, and you get the error above.
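The fallback just described can be illustrated with a toy sketch. This is illustrative only, not the actual Hadoop client code; the property name and its "simple" default are real Hadoop conventions, everything else is an assumption:

```python
import os
import xml.etree.ElementTree as ET

def auth_mode(conf_dir):
    """Toy version of the client-side lookup: read hadoop.security.authentication
    from core-site.xml, falling back to 'simple' when the file is missing --
    which is exactly the failure mode described above."""
    path = os.path.join(conf_dir, "core-site.xml")
    if not os.path.exists(path):
        return "simple"  # Hadoop's documented default when unconfigured
    root = ET.parse(path).getroot()
    for prop in root.findall("property"):
        if prop.findtext("name") == "hadoop.security.authentication":
            return prop.findtext("value")
    return "simple"

# With no Hadoop config visible to the process, the client falls back to SIMPLE,
# and a Kerberos-only NameNode then rejects the connection:
print(auth_mode("/nonexistent/conf/dir"))  # simple
```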

s3fp2yjn #2

In recent versions Hadoop changed the default HDFS port, so check which port your NameNode is actually listening on. In my case, on Hadoop 3.2.1, I fixed the problem simply by adding this property to hbase-site.xml:

<property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
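A mismatch like this can be caught by comparing the authority in hbase.rootdir against the NameNode's fs.defaultFS. A minimal sketch (the two values below are assumptions matching the answer above):

```python
from urllib.parse import urlparse

# Assumed values: fs.defaultFS from core-site.xml and hbase.rootdir from hbase-site.xml.
fs_default_fs = "hdfs://localhost:9000"
hbase_rootdir = "hdfs://localhost:9000/hbase"

nn = urlparse(fs_default_fs)
root = urlparse(hbase_rootdir)

# Host and port must agree, or HBase will talk to the wrong (or no) NameNode.
print(root.netloc == nn.netloc)  # True
```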

qmelpv7a #3

One of my teammates solved this with either of the following: copy core-site.xml and hdfs-site.xml from hadoop_dir/etc/hadoop into hbase_dir/conf, or add hadoop_dir/etc/hadoop to $HBASE_CLASSPATH. This has to be done on both the master and the region server machines.
We deploy the HBase master and the HDFS NameNode on the same box; the HBase region servers and HDFS DataNodes also share boxes.


ddarikpa #4

I recently tried to set up a Kerberos-secured Hadoop cluster and ran into the same problem. The cause turned out to be that my hbase user (the account used to start the HBase services) was missing the necessary Hadoop environment settings, such as HADOOP_CONF_DIR. Check whether your user account has these settings.
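A check like that can be scripted. A hypothetical sketch (the variable names come from the earlier hbase-env.sh answer; run it as the account that starts HBase):

```python
import os

# Variables the service account should have, per the hbase-env.sh fix above.
REQUIRED_VARS = ["HADOOP_HOME", "HADOOP_CONF_DIR"]

def missing_vars(env=os.environ):
    """Return the required Hadoop variables absent from the given environment."""
    return [v for v in REQUIRED_VARS if v not in env]

# Example with an explicit environment dict instead of the real one:
print(missing_vars({"HADOOP_HOME": "/opt/hadoop"}))  # ['HADOOP_CONF_DIR']
```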
