I'm running Hadoop in pseudo-distributed mode. I want my job to read from and write to the local filesystem, bypassing HDFS, so I'm using the file:/// scheme.
I followed this link.
These are the contents of core-site.xml:
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value> /home/abimanyu/temp</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>
These are the contents of mapred-site.xml:
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:54311</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>file:///</value>
  </property>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>1</value>
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>1</value>
  </property>
</configuration>
These are the contents of hdfs-site.xml:
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
This is the error I get when I try to start the daemons (using start-dfs or start-all):
localhost: Exception in thread "main" java.lang.IllegalArgumentException: Does not contain a valid host:port authority: file:///
localhost: at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:164)
localhost: at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:212)
localhost: at org.apache.hadoop.hdfs.server.namenode.NameNode.getAddress(NameNode.java:244)
localhost: at org.apache.hadoop.hdfs.server.namenode.NameNode.getServiceAddress(NameNode.java:236)
localhost: at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:194)
localhost: at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:150)
localhost: at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.main(SecondaryNameNode.java:676)
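As far as I can tell, the exception comes from the NameNode/SecondaryNameNode trying to parse a host and port out of whatever fs.default.name resolves to. A plain-JDK sketch of that parsing (not Hadoop's actual NetUtils code, just an illustration of why a file:/// URI has no host:port authority to extract):

```java
import java.net.URI;

public class HostPortCheck {
    public static void main(String[] args) {
        // hdfs://localhost:54310 carries a host and a port,
        // so a socket address can be built from it
        URI hdfs = URI.create("hdfs://localhost:54310");
        System.out.println(hdfs.getHost() + ":" + hdfs.getPort()); // localhost:54310

        // file:/// has an empty authority: no host, no port.
        // This is roughly the condition NetUtils.createSocketAddr
        // rejects with "Does not contain a valid host:port authority"
        URI local = URI.create("file:///");
        System.out.println(local.getHost() + ":" + local.getPort()); // null:-1
    }
}
```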
What is strange to me is that reading from the local file system works completely fine in hadoop-0.20.2 but not in hadoop-1.2.1. Has anything changed between the earlier release and the later one? Let me know how a Hadoop JAR can read from the local file system.
/home/abimanyu/binaries. So I presume this is my HADOOP_HOME. - Learner