Error: java.io.IOException: wrong value class: class org.apache.hadoop.io.Text is not class MyClass
Java
I have my mapper and reducer as follows, but I am getting a strange exception.
public static class MyMapper extends MapReduceBase implements Mapper<LongWritable, Text, Text, Info> {
    @Override
    public void map(LongWritable key, Text value, OutputCollector<Text, Info> output, Reporter reporter) throws IOException {
        Text text = new Text("someText");
        //process
        output.collect(text, infoObject); //infoObject is an Info built during processing
    }
}

public static class MyReducer extends MapReduceBase implements Reducer<Text, Info, Text, Text> {
    @Override
    public void reduce(Text key, Iterator<Info> values, OutputCollector<Text, Text> output, Reporter reporter) throws IOException {
        String value = "xyz"; //derived in some way
        //process
        output.collect(key, new Text(value)); //exception occurs at this line
    }
}

The job is configured like this:

System.out.println("Starting v14 ");
JobConf conf = new JobConf(RouteBuilderJob.class);
conf.setJobName("xyz");

String jarLocation = ClassUtil.findContainingJar(getClass());
System.out.println("path of jar file = " + jarLocation);
conf.setJarByClass(RouteBuilderJob.class);

conf.setMapOutputKeyClass(Text.class);
conf.setMapOutputValueClass(Info.class);

conf.setOutputKeyClass(Text.class);
conf.setOutputValueClass(Text.class); //am i missing something here???

conf.setMapperClass(RouteBuilderJob.RouteMapper.class);
conf.setCombinerClass(RouteBuilderJob.RouteReducer.class);
conf.setReducerClass(RouteBuilderJob.RouteReducer.class);

conf.setInputFormat(TextInputFormat.class);
conf.setOutputFormat(TextOutputFormat.class);

FileInputFormat.setInputPaths(conf, new Path(args[0]));
FileOutputFormat.setOutputPath(conf, new Path(args[1]));

JobClient.runJob(conf);
I get an exception:
Error: java.io.IOException: wrong value class: class org.apache.hadoop.io.Text is not class com.xyz.mypackage.Info
    at org.apache.hadoop.mapred.IFile$Writer.append(IFile.java:199)
    at org.apache.hadoop.mapred.Task$CombineOutputCollector.collect(Task.java:1307)
    at com.xyz.mypackage.job.MyJob$RouteReducer.reduce(MyJob.java:156)
    at com.xyz.mypackage.job.MyJob$RouteReducer.reduce(MyJob.java:1)
The internal Info object (which implements Writable) is serialized using Text:
@Override
public void write(DataOutput out) throws IOException {
    Gson gson = new Gson();
    String serializedStr = gson.toJson(this);
    Text.writeString(out, serializedStr);
}

@Override
public void readFields(DataInput in) throws IOException {
    String s = Text.readString(in);
    Gson gson = new Gson();
    JsonReader jsonReader = new JsonReader(new StringReader(s));
    jsonReader.setLenient(true);
    Info info = gson.fromJson(jsonReader, Info.class);
    //set fields using this.someField = info.getSomeField()
}
Solution
Technically, when a class is used as a combiner, its output types must match the map output types, because the combiner's output is written back into the intermediate data and becomes the reducer's input. Here the reducer, which emits Text values, is also registered as the combiner via conf.setCombinerClass(RouteBuilderJob.RouteReducer.class). During the combine phase it therefore writes Text where the framework expects Info, the declared map output value class, which is exactly where the stack trace fails (Task$CombineOutputCollector.collect). Either remove the combiner or supply a separate combiner whose output value class is Info.
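The simplest fix is to delete the conf.setCombinerClass(...) line. If map-side combining is actually wanted, the following is a minimal sketch of a dedicated combiner, assuming Info values for the same key can be merged; the merge() helper is hypothetical and stands in for whatever aggregation logic Info supports:

import java.io.IOException;
import java.util.Iterator;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

//A combiner whose output types (Text, Info) match the declared map output types,
//so the intermediate data stays Info all the way to the reducer.
public static class RouteCombiner extends MapReduceBase implements Reducer<Text, Info, Text, Info> {
    @Override
    public void reduce(Text key, Iterator<Info> values, OutputCollector<Text, Info> output, Reporter reporter) throws IOException {
        Info combined = new Info(); //a Writable needs a no-arg constructor anyway
        while (values.hasNext()) {
            combined.merge(values.next()); //hypothetical: fold one Info into the accumulator
        }
        output.collect(key, combined); //value class is Info, as the framework expects
    }
}

Then register it instead of the reducer:

conf.setCombinerClass(RouteBuilderJob.RouteCombiner.class);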