2 votes

I have set up a VM with Ubuntu. It runs Hadoop as a single node. Later I installed Apache Pig on it. Apache Pig runs great in local mode, but it always fails with ERROR 2999: Unexpected internal error. Failed to create DataStorage

I must be missing something very obvious. Can someone help me get this running, please?

More details:
1. I assume that Hadoop is running fine, because I can run MapReduce jobs in Python.
2. pig -x local runs as I expect.
3. When I just type pig, it gives me the following error:

Error before Pig is launched
----------------------------
ERROR 2999: Unexpected internal error. Failed to create DataStorage

java.lang.RuntimeException: Failed to create DataStorage
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:75)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.<init>(HDataStorage.java:58)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:214)
    at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.init(HExecutionEngine.java:134)
    at org.apache.pig.impl.PigContext.connect(PigContext.java:183)
    at org.apache.pig.PigServer.<init>(PigServer.java:226)
    at org.apache.pig.PigServer.<init>(PigServer.java:215)
    at org.apache.pig.tools.grunt.Grunt.<init>(Grunt.java:55)
    at org.apache.pig.Main.run(Main.java:452)
    at org.apache.pig.Main.main(Main.java:107)
Caused by: java.io.IOException: Call to localhost/127.0.0.1:54310 failed on local exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:775)
    at org.apache.hadoop.ipc.Client.call(Client.java:743)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:359)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:106)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:207)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:170)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:82)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1378)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:66)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1390)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:196)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:95)
    at org.apache.pig.backend.hadoop.datastorage.HDataStorage.init(HDataStorage.java:72)
    ... 9 more
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:501)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:446)
================================================================================
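For reference, the failing call in the trace goes to the NameNode RPC port (localhost:54310). A quick sanity check that the Hadoop daemons are actually up and listening (a sketch, assuming a Hadoop 1.x single-node setup):

    jps                          # should list NameNode, DataNode, JobTracker, TaskTracker
    netstat -tlnp | grep 54310   # 54310 is the NameNode RPC port from the stack trace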
Have you defined the correct Hadoop environment variables so that Pig can find the configs? – sdolgy
Yes, I did. Could this be because of user permissions? Any idea? – vrrathod

3 Answers

3 votes

This link helped me understand the possible cause of the failure.

Here is what fixed my problem:
1. Recompile Pig without Hadoop.
2. Update PIG_CLASSPATH to include all the jars from $HADOOP_HOME/lib (as sketched below).
3. Run pig.
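A rough shell version of those steps (the Ant target name and install paths are assumptions — adjust for your Pig release and your layout):

    # 1. Rebuild Pig without its bundled Hadoop jars (requires Ant)
    cd /path/to/pig-source
    ant clean jar-withouthadoop

    # 2. Put the cluster's own jars and config directory on Pig's classpath
    export HADOOP_HOME=/usr/local/hadoop
    export PIG_CLASSPATH=$HADOOP_HOME/conf:$(echo $HADOOP_HOME/lib/*.jar | tr ' ' ':')

    # 3. Launch Pig (mapreduce mode is the default)
    pig

The reason this works: a Pig jar bundled with a different Hadoop version speaks an incompatible RPC protocol to the NameNode, which is consistent with the EOFException in the trace above.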

Thanks.

0 votes

Set your PIG_CLASSPATH to point to your correct HADOOP_HOME installation, so that Pig can pick up your cluster information from core-site.xml, mapred-site.xml and hdfs-site.xml. It is better to follow the link for a correct installation.
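For example (a sketch; /usr/local/hadoop is an assumed install path — substitute your own):

    export HADOOP_HOME=/usr/local/hadoop
    # conf/ is the directory holding core-site.xml, mapred-site.xml and hdfs-site.xml
    export PIG_CLASSPATH=$HADOOP_HOME/conf
    pig   # should now read the cluster settings from those files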

-1 votes

Just install Cygwin, then add the Cygwin path to the PATH environment variable:
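For example, from a Windows Command Prompt (a sketch — C:\cygwin is an assumed install location; adjust to your own):

    setx PATH "%PATH%;C:\cygwin\bin"

This makes the Unix shell utilities that Pig's launch scripts depend on available on Windows.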

For details see here.