
I am trying to run the wordcount example in C++ on Hadoop 1.0.4 (Ubuntu 12.04), but I am getting the following error:

Command:

hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input bin/input.txt -output bin/output.txt -program bin/wordcount.

Error message:

13/06/14 13:50:11 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
13/06/14 13:50:11 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/06/14 13:50:11 WARN snappy.LoadSnappy: Snappy native library not loaded
13/06/14 13:50:11 INFO mapred.FileInputFormat: Total input paths to process : 1
13/06/14 13:50:11 INFO mapred.JobClient: Running job: job_201306141334_0003
13/06/14 13:50:12 INFO mapred.JobClient: map 0% reduce 0%
13/06/14 13:50:24 INFO mapred.JobClient: Task Id : attempt_201306141334_0003_m_000000_0, Status : FAILED
java.io.IOException
    at org.apache.hadoop.mapred.pipes.OutputHandler.waitForAuthentication(OutputHandler.java:188)
    at org.apache.hadoop.mapred.pipes.Application.waitForAuthentication(Application.java:194)
    at org.apache.hadoop.mapred.pipes.Application.<init>(Application.java:149)
    at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:71)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)
attempt_201306141334_0003_m_000000_0: Server failed to authenticate. Exiting
13/06/14 13:50:24 INFO mapred.JobClient: Task Id : attempt_201306141334_0003_m_000001_0, Status : FAILED

I haven't found any solution, and I've been trying to make this work for quite a while.

I'd appreciate any help. Thanks.


1 Answer


I found this SO question (hadoop not running in the multinode cluster) where the user got similar errors; according to the top answer there, the cause turned out to be that they had not set a class on the job. That case was Java, however.

I also found this tutorial on running the C++ wordcount example on Hadoop; hopefully it helps you out: http://cs.smith.edu/dftwiki/index.php/Hadoop_Tutorial_2.2_--_Running_C%2B%2B_Programs_on_Hadoop