I need to copy a file from my local filesystem into HDFS, and below is my configuration in hdfs-site.xml. How should I use the "hadoop fs" command to copy a file at /home/text.txt into HDFS? Should I copy it onto the namenode or the datanode?

<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/namenode</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/usr/local/hadoop_store/hdfs/datanode</value>
  </property>
</configuration>

1 Answer


Which Hadoop version are you using? The directory in dfs.namenode.name.dir is used to store the metadata, and the directory in dfs.datanode.data.dir is used to store the actual data (i.e. the user file blocks).

If you want to upload a file to HDFS, you don't need to care whether it goes to the namenode or a datanode; HDFS handles that for you. Your data will ultimately be stored on the DataNode(s).

You can use the commands bin/hadoop fs -mkdir /input and bin/hadoop fs -copyFromLocal /home/text.txt /input. The first command creates the directory "input" on HDFS, and the second copies your file into it. (Note the dash in -copyFromLocal; without it the command fails.)
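The steps above can be sketched as a short shell session (a minimal sketch, assuming Hadoop's bin directory is on your PATH and the HDFS daemons are running; the /input path is just an example):

```shell
# Create a target directory on HDFS (-p also creates parent dirs if missing)
hadoop fs -mkdir -p /input

# Copy the local file into HDFS; -put is an equivalent alias for -copyFromLocal
hadoop fs -copyFromLocal /home/text.txt /input

# Verify the file landed in HDFS
hadoop fs -ls /input

# Optionally print its contents back out
hadoop fs -cat /input/text.txt
```

You run these commands from any machine with a Hadoop client configured for your cluster; the NameNode decides which DataNode(s) physically store the blocks, so you never copy into a node's storage directories yourself.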