I have an Eclipse (Maven) project on Hadoop 2.7.1. I am able to run the WordCount Hadoop example, so Hadoop is configured correctly. But if, at runtime, I try to instantiate an object that uses the Apache Jena Model class, the following error is thrown:
Exception in thread "main" java.lang.NoClassDefFoundError:
com/hp/hpl/jena/rdf/model/Model
at hadoop.wordcount.WordCount.main(WordCount.java:20)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: com.hp.hpl.jena.rdf.model.Model
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
The same object (the one using Jena) works fine in a standalone (non-Hadoop) project. The same error also occurs when I try to run the stats Hadoop example. To build the jar I run:
mvn clean package
If I try to use classes from any other library, those classes are not "included" in the generated jar. Where am I going wrong? Any suggestions? I am out of ideas at this point!
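For context, declaring Jena in pom.xml is not enough by itself: `mvn package` builds a thin jar containing only the project's own classes, so compile-scope dependencies still have to be bundled into the job jar or put on Hadoop's classpath. The declaration would look something like this (the coordinates and version are an assumption; the `com.hp.hpl.jena.*` package names in the stack trace point at the Jena 2.x line):

```xml
<dependency>
  <!-- Jena 2.x still ships the com.hp.hpl.jena.* packages -->
  <groupId>org.apache.jena</groupId>
  <artifactId>jena-core</artifactId>
  <version>2.13.0</version> <!-- version is an assumption -->
</dependency>
```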
Edit #1:
I tried building with:
mvn clean compile assembly:single
using this assembly descriptor:
<assembly>
  <id>hadoop-job</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <unpack>false</unpack>
      <scope>provided</scope>
      <outputDirectory>lib</outputDirectory>
      <excludes>
        <exclude>${groupId}:${artifactId}</exclude>
      </excludes>
    </dependencySet>
    <dependencySet>
      <unpack>true</unpack>
      <includes>
        <include>${groupId}:${artifactId}</include>
      </includes>
    </dependencySet>
  </dependencySets>
</assembly>
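For a descriptor like this to be picked up at all, the maven-assembly-plugin has to be bound in pom.xml and pointed at the descriptor file. A minimal sketch, assuming the descriptor is saved as src/main/assembly/hadoop-job.xml (path is an assumption):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptors>
      <!-- path to the descriptor above; adjust to your layout -->
      <descriptor>src/main/assembly/hadoop-job.xml</descriptor>
    </descriptors>
  </configuration>
  <executions>
    <execution>
      <id>make-hadoop-job</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

One thing worth noting: Hadoop's RunJar does add jars found under lib/ inside the job jar to the classpath, which is what the first dependencySet relies on; but that dependencySet filters on `<scope>provided</scope>`, so compile-scope dependencies such as Jena would never be copied into lib/. That may be why this attempt still failed.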
but I still hit the same problem.
Edit #2: this is what worked in my case:
Include this plugin in pom.xml:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <configuration>
    <shadedArtifactAttached>true</shadedArtifactAttached>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
              <exclude>META-INF/LICENSE*</exclude>
              <exclude>license/*</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
then run
mvn clean compile package
(solution from @garry in the post: "hadoop java.io.IOException: Mkdirs failed to create /some/path")
It works now, but the "unpacking" step takes a long time. Any known workaround?
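One option that may be worth trying (untested here, and it may or may not reduce build time): the shade plugin supports `<minimizeJar>`, which drops classes not reachable from your code and shrinks the output jar. Since it can strip classes that are only loaded reflectively, Jena would probably need to be protected with an include filter. A sketch of the changed `<configuration>` block, assuming the Jena artifacts live under the `org.apache.jena` group:

```xml
<configuration>
  <shadedArtifactAttached>true</shadedArtifactAttached>
  <minimizeJar>true</minimizeJar>
  <filters>
    <filter>
      <!-- keep all of Jena even if minimizeJar considers it unused -->
      <artifact>org.apache.jena:*</artifact>
      <includes>
        <include>**</include>
      </includes>
    </filter>
  </filters>
</configuration>
```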