
My mapper fails to read files from multiple directories. Could anyone help? I need each mapper to read one whole file, so I added multiple input paths and implemented a custom WholeFileInputFormat and WholeFileRecordReader. In the map method I don't need the input key; I just make sure that each map call reads one whole file.

Command line:

hadoop jar AutoProduce.jar Autoproduce /input_a /input_b /output

I specified two input paths: 1. /input_a; 2. /input_b.

Run method snippets:

Job job = new Job(getConf());
// Every split is read with the custom whole-file format.
job.setInputFormatClass(WholeFileInputFormat.class);
// Both directories are registered as input paths.
FileInputFormat.setInputPaths(job, new Path(args[0]), new Path(args[1]));
FileOutputFormat.setOutputPath(job, new Path(args[2]));

map method snippets:

public void map(NullWritable key, BytesWritable value, Context context)
        throws IOException, InterruptedException {
    // Recover the file behind this (unsplittable) split.
    FileSplit fileSplit = (FileSplit) context.getInputSplit();
    System.out.println("Directory :" + fileSplit.getPath().toString());
    ......
}
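For context, here is a rough sketch of how I intend to use the split's path to tell the two directories apart (the directory names come from my command line; the branch bodies are placeholders):

FileSplit fileSplit = (FileSplit) context.getInputSplit();
// The parent directory name identifies which input tree the file came from.
String dir = fileSplit.getPath().getParent().getName();
if ("input_a".equals(dir)) {
    // process a whole file from /input_a
} else if ("input_b".equals(dir)) {
    // process a whole file from /input_b
}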

Custom WholeFileInputFormat:

import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

class WholeFileInputFormat extends FileInputFormat<NullWritable, BytesWritable> {
    @Override
    protected boolean isSplitable(JobContext context, Path file) {
        // Never split a file, so each map task sees exactly one whole file.
        return false;
    }

    @Override
    public RecordReader<NullWritable, BytesWritable> createRecordReader(
        InputSplit split, TaskAttemptContext context) throws IOException,
        InterruptedException {

        WholeFileRecordReader reader = new WholeFileRecordReader();
        // The framework calls initialize() itself after createRecordReader();
        // calling it here as well is redundant but harmless.
        reader.initialize(split, context);
        return reader;
    }
}

Custom WholeFileRecordReader:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

class WholeFileRecordReader extends RecordReader<NullWritable, BytesWritable> {
    private FileSplit fileSplit;
    private Configuration conf;
    private BytesWritable value = new BytesWritable();
    private boolean processed = false;

    @Override
    public void initialize(InputSplit split, TaskAttemptContext context)
            throws IOException, InterruptedException {
        this.fileSplit = (FileSplit) split;
        this.conf = context.getConfiguration();
    }

    @Override
    public boolean nextKeyValue() throws IOException, InterruptedException {
        if (!processed) {
            // Read the entire file into memory as a single record,
            // so each file must fit in the task's heap.
            byte[] contents = new byte[(int) fileSplit.getLength()];
            Path file = fileSplit.getPath();
            FileSystem fs = file.getFileSystem(conf);
            FSDataInputStream in = null;
            try {
                in = fs.open(file);
                IOUtils.readFully(in, contents, 0, contents.length);
                value.set(contents, 0, contents.length);
            } finally {
                IOUtils.closeStream(in);
            }
            processed = true;
            return true;
        }
        return false;
    }

    @Override
    public NullWritable getCurrentKey() throws IOException, InterruptedException {
        return NullWritable.get();
    }

    @Override
    public BytesWritable getCurrentValue() throws IOException, InterruptedException {
        return value;
    }

    @Override
    public float getProgress() throws IOException {
        // Either the single record has been emitted or it hasn't.
        return processed ? 1.0f : 0.0f;
    }

    @Override
    public void close() throws IOException {
        // Nothing to close; the stream is closed in nextKeyValue().
    }
}

PROBLEM:

After setting two input paths, all map tasks read files from only one directory.

Thanks in advance.

1 Answer


You'll have to use MultipleInputs (org.apache.hadoop.mapreduce.lib.input.MultipleInputs) instead of FileInputFormat in the driver, so your code should look like this:

MultipleInputs.addInputPath(job, new Path(args[0]), <Input_Format_Class_1>);
MultipleInputs.addInputPath(job, new Path(args[1]), <Input_Format_Class_2>);
...
MultipleInputs.addInputPath(job, new Path(args[N-1]), <Input_Format_Class_N>);

So if you want to use WholeFileInputFormat for the first input path and TextInputFormat for the second, you'd use it like this:

MultipleInputs.addInputPath(job, new Path(args[0]), WholeFileInputFormat.class);
MultipleInputs.addInputPath(job, new Path(args[1]), TextInputFormat.class);
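Applied to the job in your question, where both directories apparently hold whole files for the same mapper, the run method could look like this sketch (the setJarByClass call and the return convention are typical boilerplate I've assumed, not taken from your snippets):

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public int run(String[] args) throws Exception {
    Job job = new Job(getConf());
    job.setJarByClass(Autoproduce.class); // assumed driver class from your command line

    // One registration per input directory; both use the whole-file format,
    // so every map task still receives exactly one whole file.
    // This replaces job.setInputFormatClass(...) and FileInputFormat.setInputPaths(...).
    MultipleInputs.addInputPath(job, new Path(args[0]), WholeFileInputFormat.class);
    MultipleInputs.addInputPath(job, new Path(args[1]), WholeFileInputFormat.class);
    FileOutputFormat.setOutputPath(job, new Path(args[2]));

    return job.waitForCompletion(true) ? 0 : 1;
}

There's also a four-argument overload, MultipleInputs.addInputPath(job, path, inputFormatClass, mapperClass), in case you ever want a different Mapper per directory.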

Hope this works for you!