Cannot connect to HBase using Java

2g32fytz · posted 2021-06-09 in HBase

I am trying to connect to HBase from Java. There is only one node, which is my own machine, but I cannot seem to connect successfully.
Here is my Java code:

import java.io.IOException;
import com.google.protobuf.ServiceException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.MasterNotRunningException;
import org.apache.hadoop.hbase.ZooKeeperConnectionException;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
public class Test {
  public static void main(String[] args) throws MasterNotRunningException, ZooKeeperConnectionException, IOException, ServiceException {
    SparkConf conf = new SparkConf().setAppName("Test").setMaster("spark://10.239.58.111:7077");
    JavaSparkContext sc = new JavaSparkContext(conf);
    sc.addJar("/home/cloudera/workspace/Test/target/Test-0.0.1-SNAPSHOT.jar");
    Configuration hbaseConf = HBaseConfiguration.create();
    hbaseConf.addResource(new Path("/usr/lib/hbase/conf/hbase-site.xml"));
    HTable table = new HTable(hbaseConf, "rdga_by_id");
  }
}

I have also tried setting the configuration in code like this:

hbaseConf.set("hbase.master", "localhost");
hbaseConf.set("hbase.master.port", "60000");
hbaseConf.set("hbase.zookeeper.property.clientPort", "2181");
hbaseConf.set("hbase.zookeeper.quorum", "quickstart.cloudera");
hbaseConf.set("hbase.zookeeper.quorum", "localhost");

but it still does not work.
Here is my hbase-site.xml:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hbase.rest.port</name>
    <value>8070</value>
    <description>The port for the HBase REST server.</description>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://quickstart.cloudera:8020/hbase</value>
  </property>
  <property>
    <name>hbase.regionserver.ipc.address</name>
    <value>0.0.0.0</value>
  </property>
  <property>
    <name>hbase.master.ipc.address</name>
    <value>0.0.0.0</value>
  </property>
  <property>
    <name>hbase.thrift.info.bindAddress</name>
    <value>0.0.0.0</value>
  </property>
</configuration>

In the web UI, while the server is running, the server name is shown as "quickstart.cloudera,162011422941563375".
The error is:

2015-02-02 22:17:03,121 INFO  [main] zookeeper.ZooKeeper (ZooKeeper.java:<init>(438)) - Initiating client connection, connectString=quickstart.cloudera:16201 sessionTimeout=90000 watcher=hconnection-0x62ad0636, quorum=quickstart.cloudera:16201, baseZNode=/hbase
Exception in thread "main" java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:413)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:390)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:271)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:198)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:160)
at Test.main(Test.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:411)
... 12 more
Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:216)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:839)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:642)
... 17 more
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 23 more

Sorry for making you read through so much code. Thanks in advance.

s8vozzvw · Answer 1

Caused by:

java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace

Based on that line of the stack trace, adding htrace-core.jar to your classpath should help.
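
If it helps to confirm the diagnosis first, here is a tiny standalone check (the class name HtraceCheck is made up for illustration) that tells you whether that class is visible to the JVM at all; run it with exactly the classpath you launch your job with:

public class HtraceCheck {
  public static void main(String[] args) {
    try {
      // Same class as named in the NoClassDefFoundError above
      Class.forName("org.cloudera.htrace.Trace");
      System.out.println("htrace is on the classpath");
    } catch (ClassNotFoundException e) {
      System.out.println("htrace-core.jar is NOT on the classpath");
    }
  }
}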

tkqqtvp1 · Answer 2

Provide the full path to this jar, like this ->

sc.addJar("htrace-core.jar");
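
For example, on a CDH install the jar usually sits somewhere under the HBase lib directory. The exact path below is only an assumption, so adjust it to wherever htrace-core.jar actually lives on your machine (sc is the JavaSparkContext from the question):

// Assumed CDH location; check with something like: ls /usr/lib/hbase/lib/ | grep htrace
sc.addJar("/usr/lib/hbase/lib/htrace-core.jar");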

sqougxex · Answer 3

I also hit this error when connecting to HBase from a plain Java client, without Spark. I first added the classpath below, but it did not work:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hbase/*

But then I also added the hbase-solr lib path, because the htrace jar is in that path:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/lib/hbase/*:/usr/lib/hbase-solr/lib/*
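
To double-check which jars actually end up visible to the JVM after exporting this, here is a small sketch that just dumps the effective classpath and flags htrace entries (launch it the same way you launch your client, e.g. through the hadoop script, so that HADOOP_CLASSPATH is actually applied):

import java.io.File;

public class ClasspathDump {
  public static void main(String[] args) {
    // Print every classpath entry; mark the ones that look like the htrace jar
    for (String entry : System.getProperty("java.class.path").split(File.pathSeparator)) {
      String marker = entry.toLowerCase().contains("htrace") ? "[htrace] " : "         ";
      System.out.println(marker + entry);
    }
  }
}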

Hope this works for you.

llew8vvj · Answer 4

For Spark + HBase integration, the best approach is to add the HBase libraries to the Spark classpath. This can be done through the compute-classpath.sh script in the $SPARK_HOME/bin folder: Spark calls compute-classpath.sh and picks up the required HBase jars from it.

export CLASSPATH=$CLASSPATH:<path/to/HBase/lib/*>

For example: export CLASSPATH=$CLASSPATH:/opt/cloudera/parcels/CDH/lib/hbase/lib/*
After that, restart Spark.
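
If you would rather not edit compute-classpath.sh, one possible alternative (a sketch only, reusing the parcel path from the example above) is to put the HBase lib directory on the executor classpath via SparkConf; the driver side usually still has to be handled at launch time, for example with spark-submit's --driver-class-path, because the driver JVM is already running when this code executes:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class HBaseOnSpark {
  public static void main(String[] args) {
    // CDH parcel path taken from the example above; adjust for your install
    String hbaseLib = "/opt/cloudera/parcels/CDH/lib/hbase/lib/*";
    SparkConf conf = new SparkConf()
        .setAppName("Test")
        // Puts the HBase jars on each executor's classpath
        .set("spark.executor.extraClassPath", hbaseLib);
    JavaSparkContext sc = new JavaSparkContext(conf);
    // ... build the HBase Configuration and open the HTable here, as in the question
    sc.stop();
  }
}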

There you go :)
