CSV class not found exception

0sgqnhkj  posted on 2021-06-02  in Hadoop
Follow (0) | Answers (1) | Views (424)

I have a CSV file uploaded to HDFS, and I am using the opencsv parser to read the data. My jar is also on the Hadoop classpath; it is uploaded to HDFS at /jars/opencsv-3.9.jar. The error I get is attached below as well.
Here is my code snippet:

package mcad;

import java.io.IOException;

import com.opencsv.CSVParser;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableReducer;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class TermLabelledPapers {

   public static class InputMapper extends Mapper<LongWritable, Text, Text, Text> {

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {

        // opencsv: split one CSV record into its fields
        CSVParser parser = new CSVParser();
        String[] lines = parser.parseLine(value.toString());
        //readEntry.readHeaders();
        String doi = lines[0];
        String keyphrases = lines[3];

        // HBase Get keyed by the DOI (constructed here but not used further in this snippet)
        Get g = new Get(Bytes.toBytes(doi));
        context.write(new Text(doi), new Text(keyphrases));

    }
}

public static class PaperEntryReducer extends TableReducer<Text, Text, ImmutableBytesWritable> {

    @Override
    protected void reduce(Text doi, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {

    }
}

public static void main(String[] args) throws Exception {

    Configuration conf = HBaseConfiguration.create();
    conf.set("hbase.zookeeper.quorum", "172.17.25.18");
    conf.set("hbase.zookeeper.property.clientPort", "2183");
    //add the external jar to hadoop distributed cache (a sketch of such a helper follows after this snippet)
    //addJarToDistributedCache(CsvReader.class, conf);

    Job job = Job.getInstance(conf, "TermLabelledPapers");
    job.setJarByClass(TermLabelledPapers.class);
    job.setMapperClass(InputMapper.class);
    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(Text.class);
    // ship the opencsv jar already stored in HDFS to each task's classpath
    job.addFileToClassPath(new Path("/jars/opencsv-3.9.jar"));
    FileInputFormat.setInputPaths(job, new Path(args[0]));  // "metadata.csv"

    TableMapReduceUtil.initTableReducerJob("PaperBagofWords", PaperEntryReducer.class, job);
    job.setReducerClass(PaperEntryReducer.class);
    job.waitForCompletion(true);
 }

}
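For reference, a minimal sketch of what the commented-out addJarToDistributedCache helper might look like, assuming the dependency jar is available in the local build; the wrapper class name JarShipper and the /jars/ target directory are illustrative only, and the call uses the older DistributedCache API to match the style of the snippet above:

import java.io.File;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class JarShipper {

    // Hypothetical helper: find the local jar that a class was loaded from,
    // copy it into HDFS, and register it on the classpath of every task.
    public static void addJarToDistributedCache(Class<?> clazz, Configuration conf)
            throws IOException {
        // local path of the jar containing clazz (assumes the class was loaded from a jar)
        String localJar = clazz.getProtectionDomain().getCodeSource().getLocation().getPath();

        FileSystem fs = FileSystem.get(conf);
        Path hdfsJar = new Path("/jars/" + new File(localJar).getName());

        // upload the jar (overwriting any existing copy) and add it to the task classpath
        fs.copyFromLocalFile(false, true, new Path(localJar), hdfsJar);
        DistributedCache.addFileToClassPath(hdfsJar, conf);
    }
}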

The error shown after running the job is:

Error: java.lang.ClassNotFoundException: com.csvreader.CsvReader
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at mcad.TermLabelledPapers$InputMapper.map(TermLabelledPapers.java:69)
at mcad.TermLabelledPapers$InputMapper.map(TermLabelledPapers.java:1)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)

tjrkku2a  1#

Ideally, if the jar is on the Hadoop classpath, this error should not occur. If your project is a Maven project, you can try building a jar-with-dependencies, which bundles all the dependent jars together with your own classes. That helps rule the classpath out while diagnosing the problem; a sketch of the Maven configuration is shown below.
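A minimal sketch of the jar-with-dependencies build the answer suggests, assuming a standard Maven project; only the plugin section below would be added to the existing pom.xml, and the main class used in the usage note is taken from the question's stack trace:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <!-- build the bundled jar during mvn package -->
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>

After mvn clean package this produces a target/<artifact>-jar-with-dependencies.jar (the artifact name is a placeholder), which can be submitted with hadoop jar <that jar> mcad.TermLabelledPapers <input path>, so the opencsv classes travel inside the job jar instead of relying on /jars/opencsv-3.9.jar being found on the task classpath.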
