Error when running Hadoopizer

yvfmudvl · posted 2021-05-29 · in Hadoop

I am trying to run Hadoopizer with the following command:

hadoop jar hadd.jar -c config.xml -w /home/salma/hdfs/tmp

where "/home/salma/hdfs/tmp" specifies a directory on the HDFS file system in which Hadoop will write some temporary data.

INFO: Adding file 'file:/home/salma/Desktop/data_rna_seq/chr22_ERCC92.fa' to distributed cache (/home/salma/hdfs/tmp/static_data/db/chr22_ERCC92.fa)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/filecache/DistributedCache
    at org.genouest.hadoopizer.Hadoopizer.addToDistributedCache(Hadoopizer.java:469)
    at org.genouest.hadoopizer.Hadoopizer.prepareJob(Hadoopizer.java:282)
    at org.genouest.hadoopizer.Hadoopizer.main(Hadoopizer.java:71)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.filecache.DistributedCache
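For context, the "Caused by" line means the JVM could not locate the class anywhere on the runtime classpath. The same failure mode can be reproduced in plain Java (a minimal sketch; it assumes the Hadoop jars are not on the classpath, which is exactly the situation the stack trace reports):

```java
public class MissingClassDemo {
    public static void main(String[] args) {
        try {
            // Asking the JVM to load a class that is not on the classpath
            // raises ClassNotFoundException, the root cause in the trace above.
            Class.forName("org.apache.hadoop.filecache.DistributedCache");
            System.out.println("class found");
        } catch (ClassNotFoundException e) {
            System.out.println("not found: " + e.getMessage());
        }
    }
}
```

The difference is that `Class.forName` surfaces the checked exception directly, while code compiled against a class that is later missing at run time fails with the unchecked `NoClassDefFoundError` wrapping it, as in the trace.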

Here is the addToDistributedCache method, in Java:

private void addToDistributedCache(String fileId, URI uri, Path hdfsBasePath) throws IOException {

        FileSystem fs = hdfsBasePath.getFileSystem(jobConf);
        Path localPath = new Path(uri);
        Path hdfsPath = new Path(hdfsBasePath.toString() + Path.SEPARATOR + localPath.getName());

        if (uri.getScheme().equalsIgnoreCase("file")) {
            logger.info("Adding file '" + uri + "' to distributed cache (" + hdfsPath + ")");
            fs.copyFromLocalFile(false, true, localPath, hdfsPath);
        }
        else if (uri.getScheme().equalsIgnoreCase("hdfs")) {
            logger.info("Adding file '" + uri + "' to distributed cache");
            hdfsPath = localPath;
        }
        else {
            // TODO support other protocols (s3? ssh? http? ftp?)
            System.err.println("Unsupported URI scheme: " + uri.getScheme() + " (in " + uri + ")");
            System.exit(1);
        }

        // Add a fragment to the uri: hadoop will automatically create a symlink in the work dir pointing to this file
        // Don't add the fragment to hdfsPath because it would be encoded in a strange way
        URI hdfsUri = URI.create(hdfsPath.toString() + "#" + jobConf.get("hadoopizer.static.data.link.prefix") + fileId + "__" + localPath.getName());
        DistributedCache.addCacheFile(hdfsUri, jobConf);
    }
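The comment in the method explains why the symlink name is appended as a raw `#fragment` string rather than put into the `Path`: `URI.create` keeps the fragment verbatim, and Hadoop uses it as the symlink name in the task work directory. A small self-contained sketch of that construction (the host, port, prefix, and file id below are hypothetical placeholders, not values from the question):

```java
import java.net.URI;

public class FragmentDemo {
    public static void main(String[] args) {
        // Hypothetical HDFS path and link name, mirroring how the method
        // above concatenates hdfsPath + "#" + prefix + fileId + "__" + name.
        String hdfsPath = "hdfs://namenode:9000/tmp/static_data/db/chr22_ERCC92.fa";
        String linkName = "static_db__chr22_ERCC92.fa";

        // Appending the fragment as a raw string keeps it unencoded;
        // Hadoop reads it back via getFragment() as the symlink name.
        URI cacheUri = URI.create(hdfsPath + "#" + linkName);
        System.out.println(cacheUri.getFragment());
    }
}
```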

Can someone explain this error to me?
I am using Hadoop 2.7.3.
