I tested a MapReduce job on a single node and it seemed to work, but now that I'm trying to run it against a remote cluster I get a ClassNotFoundException. My code is structured as follows:
public class Pivot {
    public static class Mapper extends TableMapper<ImmutableBytesWritable, ImmutableBytesWritable> {
        @Override
        public void map(ImmutableBytesWritable rowkey, Result values, Context context) throws IOException {
            (map code)
        }
    }

    public static class Reducer extends TableReducer<ImmutableBytesWritable, ImmutableBytesWritable, ImmutableBytesWritable> {
        public void reduce(ImmutableBytesWritable key, Iterable<ImmutableBytesWritable> values, Context context) throws IOException, InterruptedException {
            (reduce code)
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        conf.set("fs.default.name", "hdfs://hadoop-master:9000");
        conf.set("mapred.job.tracker", "hdfs://hadoop-master:9001");
        conf.set("hbase.master", "hadoop-master:60000");
        conf.set("hbase.zookeeper.quorum", "hadoop-master");
        conf.set("hbase.zookeeper.property.clientPort", "2222");

        Job job = new Job(conf);
        job.setJobName("Pivot");
        job.setJarByClass(Pivot.class);

        Scan scan = new Scan();
        TableMapReduceUtil.initTableMapperJob("InputTable", scan, Mapper.class, ImmutableBytesWritable.class, ImmutableBytesWritable.class, job);
        TableMapReduceUtil.initTableReducerJob("OutputTable", Reducer.class, job);

        job.waitForCompletion(true);
    }
}
The error I get when trying to run the job is:
java.lang.RuntimeException: java.lang.ClassNotFoundException: Pivot$Mapper
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:857)
...
Is there something I'm missing? Why is the job having trouble finding the Mapper?
1 Answer
When running a job from Eclipse, it's important to note that Hadoop requires you to launch your job from a jar. Hadoop needs this so it can ship your code up to HDFS / the JobTracker.
In your case, I imagine you haven't bundled your job classes into a jar and then run the program "from the jar", which results in the CNFE (ClassNotFoundException).
Try building a jar and running it from the command line:
hadoop jar myjar.jar ...
Once you've verified that works, you can then test running from within Eclipse.
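If you later want to submit directly from Eclipse, one approach (a sketch, not the accepted fix above) is to point the job configuration at the jar you already built so Hadoop can ship it to the cluster even though the program itself is launched from the IDE. The path below is a placeholder you would replace with your own build output; the mapred.jar property is the classic (Hadoop 1.x) setting read at job submission time.

// Sketch: submit from an IDE by telling Hadoop which local jar to ship.
// "/path/to/pivot.jar" is a placeholder for the jar built from this project.
Configuration conf = HBaseConfiguration.create();
conf.set("mapred.jar", "/path/to/pivot.jar"); // jar that will be copied out to the task trackers

Job job = new Job(conf, "Pivot");
// setJarByClass only helps when the class was itself loaded from a jar,
// which is why running unbundled .class files from Eclipse produces the CNFE.
job.setJarByClass(Pivot.class);
// ... configure the table mapper/reducer as in the question, then:
// job.waitForCompletion(true);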