
I'd like to copy a whole local directory, with some subdirectories and files, to HDFS. HDFS already contains the root directory and some subdirectories with files; I just want to add the newer files from the local directory.

Local directory /www/hitlogfetcher/logs/:

day=20/
  |-hour=00/files.....
  |-hour=01/files.....
  |-hour=02/files....

HDFS /hitlogfetcher-test/:

day=20/
  |-hour=00/files
  |-hour=01/

When I used the command hadoop dfs -copyFromLocal /www/hitlogfetcher/logs/* /hitlogfetcher-test/ I received this error message:

Target /hitlogfetcher-test/day=20 is a directory
(day=20 is a directory that contains some subdirectories and files.)

So I would like to copy the files from directory hour=01, and then copy directory hour=02 and its files.

Is this possible using Hadoop shell commands, or some other way?
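One way to do this with plain shell plus the Hadoop CLI is to loop over the local hour= subdirectories and copy each one into the matching HDFS path. This is only a sketch: the copy_hours function name and the HADOOP override variable are illustrative, not part of Hadoop; hadoop dfs is the (now deprecated) command form used in the question.

```shell
#!/bin/sh
# Sketch: copy each local hour=NN subdirectory into the matching HDFS
# directory, one at a time. copy_hours and HADOOP are hypothetical names
# introduced for this example so the copy tool can be swapped out.
copy_hours() {
    src="$1"    # local day directory,  e.g. /www/hitlogfetcher/logs/day=20
    dst="$2"    # HDFS day directory,   e.g. /hitlogfetcher-test/day=20
    hadoop_cmd="${HADOOP:-hadoop dfs}"
    for dir in "$src"/hour=*; do
        [ -d "$dir" ] || continue   # skip when the glob matches nothing
        $hadoop_cmd -copyFromLocal "$dir" "$dst/$(basename "$dir")"
    done
}

copy_hours /www/hitlogfetcher/logs/day=20 /hitlogfetcher-test/day=20
```

Note that -copyFromLocal fails on destinations that already exist, so copying subdirectory by subdirectory like this only adds directories that are new on the HDFS side; it does not merge files into an existing hour= directory.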


1 Answer


The copyFromLocal command uploads directories recursively by default, so you don't need the "*":

hadoop dfs -copyFromLocal /www/hitlogfetcher/logs/  /hitlogfetcher-test/
                                                  ^