
When I run the Hadoop job, it fails with the following error: "java.io.IOException: Type mismatch in key from map: expected org.apache.hadoop.io.Text, received org.apache.hadoop.io.LongWritable"

I changed the key type from Text to LongWritable, but then I get a different datatype mismatch.

Main Class:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CalculateMaximum {
    public static void main(String[] args) throws IllegalArgumentException, IOException, ClassNotFoundException, InterruptedException {
        Configuration config = new Configuration();
        Job job = new Job(config);
        job.setJarByClass(CalculateMaximum.class);
        job.setMapperClass(CalculateMapper.class);
        job.setNumReduceTasks(1);
        job.setReducerClass(CalReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Remove any existing output directory before running the job.
        FileSystem fs = FileSystem.get(config);
        fs.delete(new Path(args[1]));
        job.waitForCompletion(true);
    }
}

Mapper Class:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CalculateMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    public void cal(LongWritable key, Text values, Context context) throws IOException, InterruptedException {
        // Each input line is "<year> <temperature>".
        String row = values.toString();
        String[] r1 = row.split(" ");

        Text yr = new Text(r1[0]);
        IntWritable tp = new IntWritable(Integer.parseInt(r1[1]));
        context.write(yr, tp);
    }
}

Reducer Class:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class CalReducer extends Reducer<Text, IntWritable, Text, IntWritable> {

    public void cal(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
        // Track the maximum temperature seen for this key (year).
        int max = 0;
        for (IntWritable temp : values) {
            if (temp.get() > max) {
                max = temp.get();
            }
        }
        // Emit once per key, after scanning all values.
        context.write(key, new IntWritable(max));
    }
}

My input data looks like this:

1900 39
1900 14
1900 5
1900 11
1901 32
1901 40
1901 29
1901 48

Expected output:

1900 39
1901 48
Have you tried setting up the job for the mapper output formats? Right now, you've only set the reducer output. – OneCricketeer
I just now tried adding the two map-output lines to the main class, but I still get the same error. Added lines: job.setMapOutputKeyClass(Text.class); job.setMapOutputValueClass(IntWritable.class); – karthik s
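
For reference, a sketch of where those two calls sit in the driver, based on the question's code (it assumes the mapper really does emit Text keys and IntWritable values):

// Relevant part of the driver's main(), with the map-side
// output classes declared explicitly.
Configuration config = new Configuration();
Job job = new Job(config);
job.setJarByClass(CalculateMaximum.class);
job.setMapperClass(CalculateMapper.class);
job.setReducerClass(CalReducer.class);

// Types the mapper writes; these must match the mapper's
// context.write(...) calls exactly.
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(IntWritable.class);

// Types the reducer writes (the final job output).
job.setOutputKeyClass(Text.class);
job.setOutputValueClass(IntWritable.class);

If the mapper actually emits types other than the ones declared here, the job fails at runtime with exactly the type-mismatch IOException quoted above.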

1 Answer


I believe both the key and the value are of int type. Can you try using IntWritable for the key?
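
A minimal sketch of that suggestion, assuming the year column always parses as an int. Note that the method is named map() with @Override so the framework actually invokes it; a method named cal(), as in the question, is never called by Hadoop, so the default identity mapper runs and passes the LongWritable file offset through as the key:

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CalculateMapper extends Mapper<LongWritable, Text, IntWritable, IntWritable> {

    @Override
    public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
        String[] parts = value.toString().split(" ");
        // Year as an IntWritable key, temperature as the value.
        context.write(new IntWritable(Integer.parseInt(parts[0])),
                new IntWritable(Integer.parseInt(parts[1])));
    }
}

With IntWritable keys, the driver would also need job.setMapOutputKeyClass(IntWritable.class) and job.setOutputKeyClass(IntWritable.class), and the reducer signature would become Reducer<IntWritable, IntWritable, IntWritable, IntWritable> to match.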