I installed Apache Zeppelin by downloading and extracting the binary package with all interpreters, and then started it with:
./bin/zeppelin.sh start
I then created a new notebook with the following code:
%sh
hdfs fs -ls
When I run it, I get the following result:
bash: hdfs: command not found
ExitValue: 127
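For context on the error (my assumption about how `%sh` works): the shell interpreter just runs the paragraph in a shell on the Zeppelin host, and 127 is the shell's standard "command not found" exit status, which can be reproduced with any nonexistent command name:

```shell
# bash returns 127 when a command is not found on PATH, which matches
# the "ExitValue: 127" that Zeppelin reports for the hdfs command.
status=0
bash -c 'no_such_command_xyz' 2>/dev/null || status=$?
echo "exit code: $status"   # prints: exit code: 127
```

So the message suggests no Hadoop client binaries are on the PATH of the process running Zeppelin, regardless of what interpreters ship with it.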
Isn't Zeppelin supposed to come with an HDFS interpreter, or at least support hdfs commands out of the box?
Since I wasn't sure whether Zeppelin includes an HDFS file system, I also cloned https://github.com/big-data-europe/docker-hadoop-spark-workbench and started it with docker-compose up. I navigated to the various URLs mentioned in the repo's README, and everything appears to be up, which I take to mean some HDFS instance is running. I'm not sure whether that step was necessary, or whether the "all interpreters" Zeppelin package already includes HDFS; either way, I got the same result.
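One workaround I considered (a sketch, not verified against the workbench's compose file): run the hdfs CLI inside the Hadoop container that docker-compose started, rather than on the host. The container name "namenode" is an assumption here; the real name should be checked with `docker ps`.

```shell
# "namenode" is an assumed container/service name from the workbench
# compose file -- verify the actual name with `docker ps` first.
docker exec namenode hdfs dfs -ls / \
  || echo "docker exec failed -- is the workbench up, and is the container really called 'namenode'?"
```

That would at least confirm HDFS itself is working, but it doesn't explain what Zeppelin's own `%sh` paragraph should be able to do.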
My end goal is simply a local playground for HDFS together with Spark, with the help of Zeppelin.
How am I supposed to run HDFS commands such as hdfs fs -ls with a local installation of Apache Zeppelin that includes all interpreters? Does it include HDFS and the hdfs command-line tools?
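My working theory (an assumption, not something I've confirmed in the docs) is that `%sh` only sees whatever is on the Zeppelin process's PATH, so hdfs would work only if a Hadoop client distribution is installed locally and exposed to Zeppelin, e.g. via conf/zeppelin-env.sh. The paths below are placeholders for wherever such a distribution would be extracted:

```shell
# conf/zeppelin-env.sh -- /opt/hadoop is a placeholder path (my assumption);
# point it at wherever a Hadoop client distribution is actually extracted.
export HADOOP_HOME=/opt/hadoop
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/bin
```

Is something like this the expected setup, or does the all-interpreters package provide hdfs some other way?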