We are trying to connect to a Kerberos-secured HBase cluster using the Java API, but we run into a connection refused error when creating a new table. Our code is as follows:
Configuration conf = HBaseConfiguration.create();
// Load the cluster configuration files
conf.addResource("/path/to/hbase-site.xml");
conf.addResource("/path/to/core-site.xml");
conf.addResource("/path/to/hdfs-site.xml");
conf.set("hbase.zookeeper.quorum", "our host");
conf.set("hbase.zookeeper.property.clientPort", "2181");
conf.set("java.security.auth.login.config", "/path/to/jaas.conf");
conf.set("hadoop.security.authentication", "kerberos");
System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");

String principal = "ourUser@ourRealm";
UserGroupInformation.setConfiguration(conf);
// Log in from the existing Kerberos ticket cache
UserGroupInformation ugi = UserGroupInformation.getUGIFromTicketCache("/tmp/krb5cc_1001", "ourUser@ourRealm");

// Establish the connection as the Kerberos user
Connection connection = ugi.doAs(new PrivilegedExceptionAction<Connection>() {
    @Override
    public Connection run() throws Exception {
        return ConnectionFactory.createConnection(conf);
    }
});
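The error surfaces when we then try to create a table from this connection. For reference, a minimal sketch of that step as we would write it with the standard HBase 2.x Admin API (the table name "test_table" and column family "cf" below are placeholders, not our real names):

// Sketch of the create-table step, run after the connection above is obtained
// ("test_table" and "cf" are placeholder names)
try (Admin admin = connection.getAdmin()) {
    TableDescriptor table = TableDescriptorBuilder
            .newBuilder(TableName.valueOf("test_table"))
            .setColumnFamily(ColumnFamilyDescriptorBuilder.of("cf"))
            .build();
    admin.createTable(table);
}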
Looking at the debug logs, it seems our user does not have permission to establish a connection to the database, yet the same user can SSH into the cluster and run HBase operations without any problem:
2020-09-10 10:43:18,849 DEBUG [main] security.UserGroupInformation: Hadoop login
2020-09-10 10:43:18,852 DEBUG [main] security.UserGroupInformation: hadoop login commit
2020-09-10 10:43:18,858 DEBUG [main] security.UserGroupInformation: Using kerberos user: ourUser@ourRealm
2020-09-10 10:43:18,859 DEBUG [main] security.UserGroupInformation: Using user: "ourUser@ourRealm" with name: ourUser@ourRealm
2020-09-10 10:43:18,860 DEBUG [main] security.UserGroupInformation: User entry: "ourUser@ourRealm"
2020-09-10 10:43:18,876 DEBUG [main] security.UserGroupInformation: PrivilegedAction [as: ourUser@ourRealm (auth:KERBEROS)][action: dataset.YarnQuery$1@27ff5d15]
java.lang.Exception
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1842)
at dataset.YarnQuery.writeToHbase(YarnQuery.java:110)
at dataset.Main.main(Main.java:17)
2020-09-10 10:43:18,942 DEBUG [main] security.UserGroupInformation: PrivilegedAction [as: ourUser@ourRealm (auth:KERBEROS)][action: org.apache.hadoop.hbase.client.ConnectionFactory$$Lambda$10/652433136@4b8ee4de]
The connection is being refused by the server. At this point the program keeps trying to reconnect and hangs:
2020-09-10 10:43:27,991 DEBUG [Thread-0] conn.PoolingHttpClientConnectionManager: Closing expired connections
2020-09-10 10:43:27,991 DEBUG [Thread-0] conn.PoolingHttpClientConnectionManager: Closing connections idle longer than 30 SECONDS
2020-09-10 10:43:29,485 DEBUG [ReadOnlyZKClient-HOSTIP:2181@0x7bba5817-SendThread(HOSTIP:2181)] zookeeper.ClientCnxn: Reading reply sessionid:0x2739f7ec42b3960, packet:: clientPath:/hbase serverPath:/hbase finished:false header:: 16,8 replyHeader:: 16,472447401714,0 request:: '/hbase,F response:: v{'replication,'meta-region-server,'rs,'splitWAL,'backup-masters,'table-lock,'flush-table-proc,'region-in-transition,'online-snapshot,'acl,'master,'running,'balancer,'recovering-regions,'tokenauth,'draining,'namespace,'hbaseid,'table}
2020-09-10 10:43:29,535 DEBUG [ReadOnlyZKClient-HOSTIP:2181@0x7bba5817-SendThread(HOSTIP:2181)] zookeeper.ClientCnxn: Reading reply sessionid:0x2739f7ec42b3960, packet:: clientPath:/hbase/meta-region-server serverPath:/hbase/meta-region-server finished:false header:: 17,4 replyHeader:: 17,472447401714,0 request:: '/hbase/meta-region-server,F response:: #ffffffff0001a726567696f6e7365727665723a3630303230fffffffcffffffe565ffffff826dffffff89ffffffdfffffffe150425546a25a1879626f6c636c647230322e796f746162697465732e636f6d10fffffff4ffffffd4318ffffffddffffff95fffffff9ffffffccffffffc02e100183,s{472446896369,472446896369,1597889257304,1597889257304,0,0,0,0,78,0,472446896369}
2020-09-10 10:43:29,639 DEBUG [Default-IPC-NioEventLoopGroup-1-8] ipc.NettyRpcDuplexHandler: Unknown callId: -1, skipping over this response of 0 bytes
2020-09-10 10:43:29,641 DEBUG [main] client.RpcRetryingCallerImpl: elHandlerContext.java:228)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:912)
at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:827)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:495)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
... 25 more