1 vote

I have searched a lot but failed to find a solution to this problem. The file I want to access is in HDFS, but it is not under the input path (the path given to the map/reduce job), and I want to access it from the mapper. The HDFS path specified as the input path is perfectly accessible from the mapper, but other HDFS files are not.

Inside the mapper:

FileSystem FS1 = FileSystem.get(conf);
Path path = new Path("" + FS1.getHomeDirectory());
FSDataInputStream fsdis = FS1.open(path);

This results in the following error: java.io.IOException: Cannot open filename /user/hadoop

Thanks in advance, Harsh

Did you check if /user/hadoop has read permissions? - Ravi Bhatt
Yes it has read permissions for all three. - Harsh
/user/hadoop should be a directory, so I don't think FileSystem.open() will function the same as if it were a file. - Matt D

2 Answers

1 vote

I remember using this tutorial to get something similar working. You can give it a try; it differs only slightly from what you've written, but it might still help...

@Edit: ah, and I just noticed (after reading the comments) that you are trying to open FS1.getHomeDirectory(), which is a directory. You should point to a file, not a directory, I think (you can check it out in the linked tutorial under "Reading data from a file").
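To make that concrete, here is a minimal sketch of opening a specific file under the home directory instead of the directory itself. The file name `part-r-00000` and the `resolve()` helper are assumptions for illustration, not taken from the question:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HomeDirRead {

    // Pure helper: join the home directory and a relative file name
    // into a single path string.
    static String resolve(String homeDir, String fileName) {
        return homeDir.endsWith("/") ? homeDir + fileName
                                     : homeDir + "/" + fileName;
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Open a concrete FILE under the home directory -- opening the
        // directory itself is what triggers the IOException in the question.
        Path file = new Path(resolve(fs.getHomeDirectory().toString(),
                                     "part-r-00000"));  // assumed file name
        FSDataInputStream in = fs.open(file);
        BufferedReader br = new BufferedReader(new InputStreamReader(in));
        try {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        } finally {
            br.close();
        }
    }
}
```

The same `fs.open(file)` call works from inside a mapper as well, since the mapper can build a `FileSystem` from the job's `Configuration`.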

1 vote

Can you try this once:

try {
    FileSystem fs = FileSystem.get(new Configuration());
    // List every file under the given HDFS directory, then read each one.
    FileStatus[] status = fs.listStatus(new Path("hdfs://jp.seka.com:9000/user/jeka/in"));
    for (int i = 0; i < status.length; i++) {
        BufferedReader br = new BufferedReader(new InputStreamReader(fs.open(status[i].getPath())));
        try {
            String line = br.readLine();
            while (line != null) {
                System.out.println(line);
                line = br.readLine();
            }
        } finally {
            br.close();  // release the stream even if reading fails
        }
    }
} catch (IOException e) {
    System.out.println("File not found: " + e.getMessage());
}