I'm stuck on this problem. I'm using Hadoop (CDHu3). I have tried every possible solution I could find by Googling.
This is the issue:
When I ran the Hadoop example "wordcount", the tasktracker log on one slave node showed the following errors:
1. WARN org.apache.hadoop.mapred.DefaultTaskController: Task wrapper stderr: bash: /var/tmp/mapred/local/ttprivate/taskTracker/hdfs/jobcache/job_201203131751_0003/attempt_201203131751_0003_m_000006_0/taskjvm.sh: Permission denied
2. WARN org.apache.hadoop.mapred.TaskRunner: attempt_201203131751_0003_m_000006_0 : Child Error java.io.IOException: Task process exit with nonzero status of 126.
3. WARN org.apache.hadoop.mapred.TaskLog: Failed to retrieve stdout log for task: attempt_201203131751_0003_m_000003_0 java.io.FileNotFoundException: /usr/lib/hadoop-0.20/logs/userlogs/job_201203131751_0003/attempt_201203131751_0003_m_000003_0/log.index (No such file or directory)
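For context on error 2: an exit status of 126 from bash means "command found but not executable", which is consistent with the "Permission denied" on taskjvm.sh in error 1. A minimal demonstration, unrelated to Hadoop itself (/tmp/perm_demo.sh is a throwaway file created just for this demo):

```shell
# Create a script without setting its execute bit (default mode is 644),
# then ask bash to execute it: the exec fails with "Permission denied"
# and bash reports exit status 126.
printf 'echo hello\n' > /tmp/perm_demo.sh
bash -c /tmp/perm_demo.sh          # bash: /tmp/perm_demo.sh: Permission denied
echo "exit status: $?"             # prints "exit status: 126"
rm -f /tmp/perm_demo.sh
```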
I could not find this exact issue on Google. A few posts seemed slightly relevant, and they suggested checking:
- The ulimit of the Hadoop user: my ulimit is set large enough for this bundled example.
- The memory used by the JVM: my JVM uses only -Xmx200m, too small to exceed my machine's limit.
- The permissions on mapred.local.dir and the logs directory: I set them with "chmod 777".
- Disk space: there is enough free space for Hadoop in both my log directory and mapred.local.dir.
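The checks above can be scripted roughly like this. The paths are taken from the log messages in this question, not from any standard; substitute your own mapred.local.dir and Hadoop log directory:

```shell
# Rough sketch of the diagnostics listed above; the Hadoop paths are
# from my own logs and may not exist on your node.
ulimit -n                                   # open-file limit for this user
ulimit -u                                   # process limit for this user
ls -ld /var/tmp/mapred/local 2>/dev/null \
  || echo "mapred.local.dir not present on this node"
ls -ld /usr/lib/hadoop-0.20/logs/userlogs 2>/dev/null \
  || echo "userlogs dir not present on this node"
df -h /var/tmp 2>/dev/null || df -h /       # free space where task files live
# Also worth ruling out: a filesystem mounted "noexec" prevents
# taskjvm.sh from running even after chmod 777.
mount | grep -w noexec || echo "no noexec mounts found"
```

If the partition holding mapred.local.dir turns out to be mounted noexec, that alone would explain the "Permission denied" on taskjvm.sh despite correct file permissions.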
How can I solve this problem?