My Mapper class looks like this:
import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class preprocessMapper extends Mapper<LongWritable, Text, Text, Text> {

    private String Heading = "";
    private String para = "";
    private Integer record = 0;
    private String word;

    @Override
    public void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Key each input line by the date taken from its file name.
        String fileName = ((FileSplit) context.getInputSplit()).getPath().getName();
        String date = fileName.substring(5, 15);
        Text t1 = new Text(date);
        context.write(t1, value);
    }
}
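The date key comes from the input file name: the mapper assumes a 10-character date starts at character index 5 of every file name. As a quick illustration of what substring(5, 15) extracts (the file name and class below are hypothetical, not one of my actual inputs):

public class FileNameDateCheck {  // hypothetical standalone check, not part of the job
    public static void main(String[] args) {
        String fileName = "data_2018-07-18.txt";       // made-up name for illustration
        System.out.println(fileName.substring(5, 15)); // prints "2018-07-18"
    }
}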
My Reducer class is as follows:
import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class preprocessReducer extends Reducer<Text, Text, Text, Text> {

    // private IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        // Concatenate every value that arrived under this date key into one record.
        String para = "";
        for (Text val : values) {
            para = para + val + " ";
        }
        Text t2 = new Text(para);
        // result.set(para);
        context.write(key, t2);
    }
}
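For reference, the reduce loop above re-copies the accumulated string on every concatenation; a StringBuilder-based sketch of the same logic (the class name below is made up, and it produces the same output as the reducer above) would be:

import java.io.IOException;

import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class preprocessReducerSb extends Reducer<Text, Text, Text, Text> {
    @Override
    public void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        // Same concatenation as above, built without re-copying the string each time.
        StringBuilder para = new StringBuilder();
        for (Text val : values) {
            para.append(val.toString()).append(" ");
        }
        context.write(key, new Text(para.toString()));
    }
}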
And here is my driver (job configuration) class:
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class preprocess {

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "Preprocess");
        job.setJarByClass(preprocess.class);
        job.setMapperClass(preprocessMapper.class);
        job.setReducerClass(preprocessReducer.class);
        job.setNumReduceTasks(1);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        // job.setInputFormatClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
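For completeness: the driver only sets the final output classes, and when the map output key/value classes are not set explicitly, Hadoop falls back to those same classes, so the mapper's Text/Text output matches this configuration. An equivalent but more explicit sketch would declare the intermediate types as well:

        // Equivalent to the driver above, with the map (intermediate) output types spelled out.
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);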
I am trying to process a set of text files with the code above, but I get the following format-related error. Can someone tell me where the format is incorrect?
18/07/18 19:38:09 INFO mapreduce.Job: Task Id :
attempt_1528077494936_5165_m_000001_2, Status : FAILED
Error: java.lang.ClassCastException: org.apache.hadoop.io.LongWritable
cannot be cast to org.apache.hadoop.io.Text
at preprocessMapper.map(preprocessMapper.java:20)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)