2
votes

I am new to Hadoop. Is there a bash command to transfer files from the Hadoop distributed file system (HDFS) to the local file system on a Hadoop node?

I am using Hadoop 2.6.0

I saw a similar question that asks how to do the same in Java: Copying files from HDFS to local file system with JAVA

Can we do it with a simple shell command instead (which runs on a node that is part of the hadoop cluster)?

2
Hmm, this looks like a possible duplicate, but I just checked and the commands in the answers there do not work. It looks like the hdfs command works for Hadoop 2.6.0 and bin/hadoop fs is deprecated. - Pranjal Mittal

2 Answers

2
votes

hdfs dfs -get /hdfs/path /local/path

hdfs dfs -put /local/path /hdfs/path
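For example, with a hypothetical HDFS file /user/hadoop/data.csv and /tmp as the local target directory (both paths are just placeholders), the copy looks like this:

# copy a file out of HDFS to the local file system (example paths)
hdfs dfs -get /user/hadoop/data.csv /tmp/data.csv

# verify it arrived on the local disk
ls -l /tmp/data.csv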

0
votes

If you want to pull data down from HDFS to a local directory, you'll need to use the -get or -copyToLocal switch of the hadoop fs command:

hadoop fs -copyToLocal hdfs://path localpath
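The -get switch works the same way; for example, with placeholder paths:

hadoop fs -get /user/hadoop/data.csv /tmp/data.csv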

Just call the command from a shell script. You can do something like this:

for line in $(awk '/.csv/ {print $2}' /user/hadoop/TempFiles/CLNewFiles.txt)
do
    # -copyToLocal needs the leading dash; copy each listed .csv out of HDFS
    hadoop fs -copyToLocal "/user/hadoop/TempFiles/$line" yourlocalpath
    echo "$line file is downloading from hadoop"
done
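If the list file (CLNewFiles.txt) itself sits in HDFS rather than on the local disk, you can stream it out with hadoop fs -cat first. A minimal sketch, assuming the same example paths as above:

# stream the list file from HDFS, then copy each listed .csv to the local directory
hadoop fs -cat /user/hadoop/TempFiles/CLNewFiles.txt | awk '/.csv/ {print $2}' | \
while read -r line
do
    hadoop fs -copyToLocal "/user/hadoop/TempFiles/$line" yourlocalpath
    echo "$line file has been copied from HDFS"
done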