I have upgraded my Hortonworks distribution to the most recent Hadoop:
Hadoop 2.4.0.2.1.2.1-471
Subversion git@github.com:hortonworks/hadoop.git -r 9e5db004df1a751e93aa89b42956c5325f3a4482
Compiled by jenkins on 2014-05-27T18:57Z
Compiled with protoc 2.5.0
From source with checksum 9e788148daa5dd7934eb468e57e037b5
This command was run using /usr/lib/hadoop/hadoop-common-2.4.0.2.1.2.1-471.jar
Before the upgrade I wrote a Java MapReduce program that uses Hive tables for input and output. It worked on the previous Hadoop version, although at compile time I got deprecation warnings for the following code:
import org.apache.hcatalog.mapreduce.HCatInputFormat;
import org.apache.hcatalog.mapreduce.InputJobInfo;
Job job = new Job(conf, "Foo");
HCatInputFormat.setInput(job, InputJobInfo.create(dbName, inputTableName, null));
Now, after updating the dependencies to the new jars from Hadoop 2.4.0.2.1.2.1-471 and running the same code, I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo
at com.bigdata.hadoop.Foo.run(Foo.java:240)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at com.bigdata.hadoop.Foo.main(Foo.java:272)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException: org.apache.hcatalog.mapreduce.InputJobInfo
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 9 more
To run the code I use the following settings:
export LIBJARS=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar
export HADOOP_CLASSPATH=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar,/usr/lib/hive/lib/hive-exec.jar,/usr/lib/hive/lib/hive-metastore.jar,/usr/lib/hive/lib/libfb303-0.9.0.jar,/usr/lib/hive/lib/jdo-api-3.0.1.jar,/usr/lib/hive/lib/antlr-runtime-3.4.jar,/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar,/usr/lib/hive/lib/datanucleus-core-3.2.10.jar
Any idea why I get java.lang.NoClassDefFoundError: org/apache/hcatalog/mapreduce/InputJobInfo?
2 Answers
2guxujil 1#
I think you should add the following dependency to your pom.xml.
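The actual dependency block did not survive in this copy of the answer; as a rough sketch, the coordinates and version below are my assumption (the Hive 0.13 line that ships with HDP 2.1), so check them against your cluster:
<!-- hypothetical sketch: HCatalog core artifact from the Hive 0.13 line (verify the version) -->
<dependency>
  <groupId>org.apache.hive.hcatalog</groupId>
  <artifactId>hive-hcatalog-core</artifactId>
  <version>0.13.0</version>
</dependency>
<!-- note: in these releases the classes live under org.apache.hive.hcatalog;
     verify whether this jar still provides the old org.apache.hcatalog.mapreduce.InputJobInfo -->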
mpbci0fu 2#
I ran into the same problem. In your case, the classpath needs the following jars:
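The jar list itself was lost in this copy; presumably it is the same set of HCatalog and Hive jars exported above. A sketch under that assumption, using the HDP 2.1 paths from the question (note that HADOOP_CLASSPATH is an ordinary Java classpath, so its entries are colon-separated, while the comma-separated list is what -libjars expects):
# hypothetical reconstruction of the jar list referenced by this answer
export HADOOP_CLASSPATH=/usr/lib/hive-hcatalog/share/hcatalog/hive-hcatalog-core.jar:\
/usr/lib/hive/lib/hive-exec.jar:\
/usr/lib/hive/lib/hive-metastore.jar:\
/usr/lib/hive/lib/libfb303-0.9.0.jar:\
/usr/lib/hive/lib/jdo-api-3.0.1.jar:\
/usr/lib/hive/lib/antlr-runtime-3.4.jar:\
/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar:\
/usr/lib/hive/lib/datanucleus-core-3.2.10.jar
# the comma-separated list goes to the job itself, e.g. (jar name is a placeholder):
hadoop jar Foo.jar com.bigdata.hadoop.Foo -libjars ${LIBJARS}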