Connection closed when trying to connect to kerberized HBase

unftdfkk · posted 2021-05-26 in Spark

I am trying to connect to a kerberized HBase cluster from outside the HDP cluster. I use the configuration below (see the code), and I have also tried passing hbase-site.xml via the "--files" option of spark-submit. In both cases I get the exception below (hostnames masked). I am not sure what is causing the connection to be closed:
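For context, the spark-submit invocation looks roughly like this (class name, jar, and paths are placeholders rather than my real values):

```
spark-submit \
  --master local \
  --class com.example.HBaseSparkJob \
  --files /pathto/hadoopconf/hbase-site.xml \
  hbase-spark-job.jar
```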

```
Caused by: java.net.SocketTimeoutException: callTimeout=60000, callDuration=69264: Connection closed row 'mytable,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=myhbase-master,16020,1602115323155, seqNum=-1
        at org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:159)
        at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed
        at org.apache.hadoop.hbase.ipc.NettyRpcDuplexHandler.channelInactive(NettyRpcDuplexHandler.java:211)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
        at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:377)
        at org.apache.hbase.thirdparty.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:342)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
        at org.apache.hbase.thirdparty.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
        at org.apache.hbase.thirdparty.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:277)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
        at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1354)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
        at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:917)
        at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:822)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
        at org.apache.hbase.thirdparty.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
        at org.apache.hbase.thirdparty.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
```

My code:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableInputFormat;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

        SparkConf conf = new SparkConf().setAppName("Hbase SPARK").setMaster("local");

        JavaSparkContext javaSparkContext=new JavaSparkContext(conf);

        System.setProperty("java.security.krb5.conf", "/etc/krb5.conf");
        System.setProperty("sun.security.krb5.debug", "true");
        System.setProperty("HADOOP_CONF_DIR","/pathto/hadoopconf/");
        System.setProperty("HADOOP_CONF_DIR","/pathto/hadoopconf/");
        final Configuration config = HBaseConfiguration.create();               

        config.set("hadoop.security.authentication", "kerberos");
        config.set("hbase.security.authentication", "true");
        config.set("hbase.rpc.protection", "authentication");
        config.set("hbase.master.kerberos.principal", "hbase/_HOST@myhost.com");
        config.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@myhost.com");
        config.set("hbase.zookeeper.quorum", "xxxx.yyy.zzz.com,xxx.yyy..zzz.com");
        conf.set("hbase.connection-timeout", "60000");
        conf.set("hbase.zookeeper.session.timeout","30000");
        conf.set("hbase.hbase.client.retries.number","3");
        conf.set("hbase.hbase.client.pause","1000");
        conf.set("hbase.zookeeper.recovery.retry","1");
        config.set("hbase.zookeeper.property.clientPort", "2181");
        config.set("zookeeper.znode.parent", "/hbase-secure");
        config.set(TableInputFormat.INPUT_TABLE, "mytable");
        // Create the connection. Log in from the keytab first;
        // principal and keytabLocation are placeholders (masked):
        String principal = "user@MYHOST.COM";
        String keytabLocation = "/pathto/user.keytab";
        Connection connection = null;
        try {
            UserGroupInformation.setConfiguration(config);
            UserGroupInformation.loginUserFromKeytab(principal, keytabLocation);

            connection = ConnectionFactory.createConnection(config);
            System.out.println("connected: " + connection.getConfiguration());
        } catch (IOException e) {
            e.printStackTrace();
        }

        // Read the table as an RDD of (row key, Result) pairs.
        JavaPairRDD<ImmutableBytesWritable, Result> javaPairRdd =
                javaSparkContext.newAPIHadoopRDD(connection.getConfiguration(),
                        TableInputFormat.class, ImmutableBytesWritable.class, Result.class);
        System.out.println("Count: " + javaPairRdd.count());
```

No answers yet.
