In Hadoop, the hdfs dfs -text and hdfs dfs -getmerge commands make it easy to read the contents of compressed files in HDFS from the command line, including piping them into other commands for processing (e.g. wc -l <(hdfs dfs -getmerge /whatever 2>/dev/null)).
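For concreteness, here is the read-side pattern I rely on (paths are made up): -text detects the codec, decompresses, and streams plain text to stdout, so it composes with anything.

    # -text decompresses known codecs (gzip, sequence files, ...) to stdout
    hdfs dfs -text /data/part-*.gz 2>/dev/null | wc -l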
Is there a reciprocal of these commands, i.e. a way to push content to HDFS from the command line while supporting the same compression schemes and container formats as the commands above?
hdfs dfs -put seemingly just makes a raw byte-for-byte copy of a local file into HDFS, with no compression and no change of container format.
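The closest thing I have found is to compress locally and stream the result in, since -put reads from stdin when the source is given as "-". That covers gzip, but produces nothing Hadoop-specific such as Snappy in a container format (paths hypothetical):

    # compress locally, then stream into HDFS; -put treats "-" as stdin
    gzip -c datafile.txt | hdfs dfs -put - /data/datafile.txt.gz
    # round-trips fine: -text undoes the gzip
    hdfs dfs -text /data/datafile.txt.gz | head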
Answers suggesting command-line tools for manipulating such compression algorithms and container formats are welcome too. I typically see Snappy-compressed data in CompressedStreams (the hdfs.fileType written by Flume's HDFS sink), but I can't figure out how to convert a plain old text file (one datum per line) into such a file from the command line. I gave snzip a try (as suggested in this askubuntu question), as well as this snappy command-line tool, but I couldn't use either of them to generate Hadoop-friendly Snappy files (or to read the contents of Snappy files ingested into HDFS by Apache Flume).
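For completeness, the only Hadoop-native workaround I know of is to rewrite already-uploaded data through SnappyCodec with a map-only identity streaming job, which feels far too heavyweight for a simple upload. A sketch, assuming Hadoop 2 property names and the usual location of the streaming jar (both may differ on your distribution):

    # upload plain text first, then rewrite it through SnappyCodec
    hdfs dfs -put plain.txt /tmp/plain.txt
    hadoop jar "$HADOOP_HOME"/share/hadoop/tools/lib/hadoop-streaming-*.jar \
        -D mapreduce.job.reduces=0 \
        -D mapreduce.output.fileoutputformat.compress=true \
        -D mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.SnappyCodec \
        -input /tmp/plain.txt \
        -output /data/snappy-out \
        -mapper cat

Even then I end up with part-* files rather than a single named file, so a direct command-line equivalent of -put with compression would still be preferable.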