Hadoop MapReduce incompatible with Guava

rkttyhzu, posted on 2021-06-03 in Hadoop

I'm trying to run a simple MapReduce example in Hadoop. This is my main program:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    Configuration configuration = new Configuration();

    Job job = new Job(configuration, "conf");
    job.setMapperClass(MapClass.class);

    int numreducers = 1;
    job.setNumReduceTasks(numreducers);

    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(Text.class);

    FileInputFormat.addInputPath(job, new Path("/user/www/input"));
    FileOutputFormat.setOutputPath(job, new Path("/user/www/output/"));
    System.exit(job.waitForCompletion(true) ? 0 : 1);

I'm using the hadoop-2.0.0 and guava 14.0.1 libraries, and the program throws this exception:

    Exception in thread "main" java.lang.IncompatibleClassChangeError: class com.google.common.cache.CacheBuilder$3 has interface com.google.common.base.Ticker as super class
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
    at com.google.common.cache.CacheBuilder.<clinit>(CacheBuilder.java:190)
    at org.apache.hadoop.hdfs.DomainSocketFactory.<init>(DomainSocketFactory.java:46)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:456)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:410)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:128)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2308)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:87)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2342)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2324)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:163)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:335)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:194)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(FileInputFormat.java:368)

This looks like a library version mismatch. How do I fix it?


jum4pzuy1#

Check this link, it may help you - click here
If that doesn't solve it, an extra Guava jar on the classpath is probably causing this exception.
Could you post a list of the files in your Hadoop lib directory?
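
One quick way to confirm which Guava jar the runtime actually picks up is to print the code source of the two classes named in the stack trace. Below is a minimal diagnostic sketch (the class name GuavaJarCheck is made up for illustration); run it on the same classpath your job uses and compare the two locations against the Guava jar in Hadoop's lib directory:

    import com.google.common.base.Ticker;
    import com.google.common.cache.CacheBuilder;

    public class GuavaJarCheck {
        public static void main(String[] args) {
            // Print the jar each of the two classes from the stack trace is
            // actually loaded from. If they resolve to different jars, or to
            // an older Guava bundled with Hadoop, that jar is the culprit.
            System.out.println("CacheBuilder loaded from: "
                    + CacheBuilder.class.getProtectionDomain().getCodeSource().getLocation());
            System.out.println("Ticker loaded from: "
                    + Ticker.class.getProtectionDomain().getCodeSource().getLocation());
        }
    }

If the classes come from an older Guava (Hadoop 2.x typically bundles Guava 11.0.x in its lib directory), that jar is shadowing your guava-14.0.1.jar; removing the duplicate or compiling your application against the same Guava version that Hadoop ships should clear the IncompatibleClassChangeError.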
