3
votes

I'm getting the error below: java.lang.IllegalArgumentException: Wrong FS: hdfs://0.0.0.0:9000/user/hadoop/dataset/data.log, expected: file:///

Code snippet



    Configuration conf = new Configuration();
    conf.addResource(new Path("/home/hadoo/hadoop-2.5.2/etc/hadoop/core-site.xml"));
    conf.addResource(new Path("/home/hadoo/hadoop-2.5.2/etc/hadoop/hdfs-site.xml"));
    FileSystem fs = FileSystem.get(conf);



    Path path = new Path("hdfs://0.0.0.0:9000/user/hadoop/dataset/data.log");
    try {
      if (fs.exists(path)) {
        return true;
      } else {
        return false;
      }
    } catch (IOException e) {
      e.printStackTrace();
      return false;
    }

But if I remove the prefix "hdfs://0.0.0.0:9000" from the path, it works. Can you please guide me on how to resolve this issue without changing the HDFS path? FYI, the Hadoop server and the client are running on different machines, i.e., Hadoop (HDFS): 172.xx.xx.247, my test client: 172.xx.xx.236.

core-site.xml

<configuration>
 <property>
  <name>fs.default.name</name>
  <value>hdfs://172.xx.xx.247:9000</value>
 </property>
</configuration>

hdfs-site.xml

<configuration>
 <property>
  <name>dfs.replication</name>
  <value>1</value>
 </property>
 <property>
  <name>dfs.name.dir</name>
  <value>file:///home/hadoop/hadoopdata/hdfs/namenode</value>
 </property>
 <property>
  <name>dfs.data.dir</name>
  <value>file:///home/hadoop/hadoopdata/hdfs/datanode</value>
 </property>
</configuration>

core-default.xml

<property>
 <name>fs.file.impl</name>
 <value>org.apache.hadoop.fs.LocalFileSystem</value>
</property>
<property>
 <name>fs.hdfs.impl</name>
 <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
</property>

1 Answer

3
votes

You have a spelling error in your code.

Your code: conf.addResource(new Path("/home/hadoo/hadoop-2.5.2/etc/hadoop/core-site.xml"));

Try: conf.addResource(new Path("/home/hadoop/hadoop-2.5.2/etc/hadoop/core-site.xml"));

Because of the typo ("hadoo" instead of "hadoop"), your core-site.xml is never loaded, so the Configuration keeps the default fs.default.name of file:///. FileSystem.get(conf) therefore returns a LocalFileSystem, which rejects your hdfs:// path with the "Wrong FS" error.
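As a more robust alternative that does not depend on the XML files being found at a hard-coded path, you can derive the filesystem from the path's own URI via FileSystem.get(URI, Configuration). This is a minimal sketch assuming the standard Hadoop 2.x client API; the class name HdfsPathChecker is made up for illustration:

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPathChecker {
  public static boolean exists(String pathStr) {
    Configuration conf = new Configuration();
    Path path = new Path(pathStr);
    try {
      // Select the filesystem from the URI's scheme and authority
      // (hdfs://host:port) instead of relying on fs.default.name
      // loaded from core-site.xml.
      FileSystem fs = FileSystem.get(path.toUri(), conf);
      return fs.exists(path);
    } catch (IOException e) {
      e.printStackTrace();
      return false;
    }
  }

  public static void main(String[] args) {
    // Full hdfs:// path kept, as requested in the question.
    System.out.println(exists("hdfs://0.0.0.0:9000/user/hadoop/dataset/data.log"));
  }
}
```

With this approach the hdfs:// prefix is resolved to a DistributedFileSystem regardless of whether core-site.xml was loaded, so the "Wrong FS" mismatch cannot occur.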