0 votes

I'm trying to open a file to pass some parameters read from the file to the MapReduce job. This code works in local mode, but when I try to access HDFS it doesn't work.

This is my code:

    Path tmpPath = new Path(tmpFile);
    try {
        InputStream ips = new FileInputStream(tmpFile);
        InputStreamReader ipsr = new InputStreamReader(ips);
        BufferedReader br = new BufferedReader(ipsr);

        String[] minMax = br.readLine().split("-");
        min = minMax[0];
        max = minMax[1];
        br.close();
    } catch (Exception e) {
        System.out.println(e.toString());
        System.exit(-1);
    }

This is the error that appears:

"java.io.FileNotFoundException: hdfs:/quickstart.cloudera:8020/user/cloudera/dataOut/tmp/part-r-00000 (No such file or directory)"

This is where I write the file in the previous job:

    Path tmp = new Path("dataOut/tmp");
    FileOutputFormat.setOutputPath(job, tmp);

As a MapReduce job, this writes the file part-r-00000.

Probably all of you will say, "Try the distributed cache". I've already tried, but I'm a newbie with Java, Hadoop and MapReduce, and I couldn't make it work...
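
For reference, the distributed cache approach would look roughly like this with the mapreduce v2 API (a sketch, using the output path of the previous job; the job variable and the mapper's setup() are just the usual placeholders):

    // Driver side: register the HDFS file so it is shipped to every task
    job.addCacheFile(new Path("/user/cloudera/dataOut/tmp/part-r-00000").toUri());

    // Mapper side: the cached file is symlinked into the task's working
    // directory under its own file name, so plain java.io works here
    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        BufferedReader br = new BufferedReader(new FileReader("part-r-00000"));
        String[] minMax = br.readLine().split("-");
        br.close();
    }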

Thanks

2

Did you check whether that file is present at that path in HDFS? – alekya reddy

Yes, I already checked. It's there. – Sebastian Loeb Sucre

2 Answers

0 votes

Looking at your error message: "java.io.FileNotFoundException: hdfs:/quickstart.cloudera:8020/user/cloudera/dataOut/tmp/part-r-00000 (No such file or directory)"

It seems that your output path is not present in the given directory. Try running the following command to check whether you are able to access the path:

    hadoop fs -text hdfs://quickstart.cloudera:8020/user/cloudera/dataOut/tmp/part-r-00000
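
You can also do the same check from Java before opening the file (a sketch; it assumes the default Configuration on your classpath already points fs.defaultFS at the cluster):

    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path out = new Path("/user/cloudera/dataOut/tmp/part-r-00000");
    // exists() returns false instead of throwing FileNotFoundException
    if (!fs.exists(out)) {
        System.err.println("No such file on HDFS: " + out);
    }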

0 votes

I finally got it working. I used this code:

    // DEFAULT_FS is a constant holding the cluster URI,
    // e.g. "hdfs://quickstart.cloudera:8020"
    Configuration conf = new Configuration();
    Path file = new Path(DEFAULT_FS + "/user/cloudera/dataOut/tmp/part-r-00000");
    FileSystem hdfs = FileSystem.get(file.toUri(), conf);
    FSDataInputStream in = hdfs.open(file);
    // Read the whole file at once; it only holds the one "min-max" line
    byte[] content = new byte[(int) hdfs.getFileStatus(file).getLen()];
    in.readFully(content);
    in.close();
    String maxMin = new String(content);
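
For comparison, the same read can keep the original BufferedReader style by wrapping the stream returned by hdfs.open() instead of a local FileInputStream (a sketch, using the same DEFAULT_FS constant as above):

    Configuration conf = new Configuration();
    Path file = new Path(DEFAULT_FS + "/user/cloudera/dataOut/tmp/part-r-00000");
    FileSystem hdfs = FileSystem.get(file.toUri(), conf);
    // hdfs.open() returns an InputStream, so the original reader code works unchanged
    BufferedReader br = new BufferedReader(new InputStreamReader(hdfs.open(file)));
    String[] minMax = br.readLine().split("-");
    min = minMax[0];
    max = minMax[1];
    br.close();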