I have a use case where I need to copy the latest generated HDFS file to a remote Linux server. I do not want to store an intermediate copy in the local file system and then scp it to the remote server.
I am aware of the following approach, but I want to AVOID it (for the obvious reason: the overhead of staging a huge file in the local file system)
hadoop fs -copyToLocal <src> <dest>
followed by an scp to the remote Linux server.
Is there a command to copy an HDFS file directly to a remote Linux server?
`ssh user@host 'hadoop fs -copyToLocal ...'` – OneCricketeer
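One way to avoid the intermediate local copy is to stream the file's contents with `hdfs dfs -cat` and pipe them straight over SSH, so nothing is ever written to the local disk of the node running the command. A minimal sketch, assuming passwordless SSH to the remote host and hypothetical source/destination paths:

```shell
# Stream the HDFS file to stdout and pipe it over SSH to the remote
# server; no intermediate copy touches the local file system.
# /data/output/part-00000 and /remote/data/ are placeholder paths.
hdfs dfs -cat /data/output/part-00000 | ssh user@remotehost 'cat > /remote/data/part-00000'
```

Note that this streams the bytes through the node where you run the command, so network bandwidth is still consumed twice (HDFS to local node, local node to remote host), but no local disk space is used. For compressible data, inserting `gzip -c` before the pipe and `gunzip -c` on the remote side can cut the transfer time.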