I want to create a directory for each user.
I've looked at several how-tos and they all say different things. I want this to be as easy as possible (I don't care about encryption, since users will log in to the machine with their SSH keys).
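To be concrete, what I have in mind is something like this, run once from the master node. The username is a placeholder and the commands are just my guess at the approach, not something I've verified:

```
# Create a per-user home directory in HDFS and hand ownership to that user
# (run as the HDFS superuser; "myuser" is a placeholder)
hdfs dfs -mkdir -p /user/myuser
hdfs dfs -chown myuser:myuser /user/myuser
```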
I've found this small guide: hadoop user file permissions
But I have a few questions:
Do I need to create directories and users on each slave/node machine too?
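(By creating the users on each node I mean OS-level accounts, roughly like the following on every slave; the hadoop group is an assumption about my setup, not something from the guide:)

```
# Create the OS account on a node ("myuser" is a placeholder)
sudo useradd -m myuser
# Possibly add it to a hadoop group -- assuming such a group exists in my setup
sudo usermod -aG hadoop myuser
```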
What exactly is the /user/myuser folder? Is it supposed to be the /opt/hadoop/dfs/name/data folder (dfs.data.dir) configured in $HADOOP_HOME/etc/hadoop/hdfs-site.xml? Do I also need to give/create a dfs.name.dir directory for each user?
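To make my confusion concrete, I'm not sure whether these two commands are supposed to point at the same place (the local path is taken from my own config and may be irrelevant):

```
# HDFS namespace path, viewed through the HDFS client
hdfs dfs -ls /user/myuser

# Local filesystem path on the node, from my hdfs-site.xml (dfs.data.dir)
ls /opt/hadoop/dfs/name/data
```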
After I create the users and directories, do I need to put some parameters in each user's .bashrc, or give them specific permissions, so they can use the hadoop commands (for example put/delete files, create directories)?
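(Roughly the kind of .bashrc additions I'm wondering about; the HADOOP_HOME path is just a guess based on my install location:)

```
# Possible per-user environment setup (paths assumed, not confirmed)
export HADOOP_HOME=/opt/hadoop
export PATH="$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin"

# ...so that a user could then do things like:
hdfs dfs -mkdir -p /user/myuser/data
hdfs dfs -put localfile.txt /user/myuser/data/
hdfs dfs -rm /user/myuser/data/localfile.txt
```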
Anything else I forgot?
P.S. My Hadoop works with Spark, if that matters.