I have installed Hadoop on a Linux cluster. When I try to start the daemons with the command $ bin/start-all.sh, I get the following errors:
mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied
chown: cannot access `/var/log/hadoop/spuri2': No such file or directory
/home/spuri2/spring_2012/Hadoop/hadoop/hadoop-1.0.2/bin/hadoop-daemon.sh: line 136: /var/run/hadoop/hadoop-spuri2-namenode.pid: Permission denied
head: cannot open `/var/log/hadoop/spuri2/hadoop-spuri2-namenode-gpu02.cluster.out' for reading: No such file or directory
localhost: /home/spuri2/.bashrc: line 10: /act/Modules/3.2.6/init/bash: No such file or directory
localhost: mkdir: cannot create directory `/var/log/hadoop/spuri2': Permission denied
localhost: chown: cannot access `/var/log/hadoop/spuri2': No such file or directory
I have configured the log directory parameter in conf/hadoop-env.sh to point to a /tmp directory, and I have also set "hadoop.tmp.dir" in core-site.xml to a /tmp directory. I do not have write access to /var/log, yet the Hadoop daemons are still trying to write to /var/log and failing.
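For reference, this is roughly what my conf/hadoop-env.sh additions look like (the exact /tmp paths are from my setup; HADOOP_PID_DIR is my attempt to also redirect the PID files away from /var/run, which appears in the errors above):

```shell
# In conf/hadoop-env.sh (Hadoop 1.x):
# redirect daemon logs away from the default /var/log/hadoop
export HADOOP_LOG_DIR=/tmp/hadoop-logs

# redirect the daemon .pid files away from /var/run/hadoop
export HADOOP_PID_DIR=/tmp/hadoop-pids
```

Despite these settings, the startup scripts still seem to use the /var/log and /var/run paths.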
Why is this happening?