
I have Hadoop 2.6.3 and Pig 0.6.0, with all the daemons up and running in a single-node cluster. When I start Pig, it connects only to file:///, not to HDFS. Could you please tell me how to make it connect to HDFS? Below is the INFO log that I see:

2016-01-10 20:58:30,431 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: file:///
2016-01-10 20:58:30,650 [main] INFO  org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=

When I run this command in the Grunt shell:

grunt> ls  hdfs://localhost:54310/  
2016-01-10 21:05:41,059 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2999: Unexpected internal error. Wrong FS: hdfs://localhost:54310/, expected: file:///
Details at logfile: /home/hguna/pig_1452488310172.log

I have no clue as to why it is expecting file:///

ERROR 2999: Unexpected internal error. Wrong FS: hdfs://localhost:54310/, expected: file:///

java.lang.IllegalArgumentException: Wrong FS: hdfs://localhost:54310/, expected: file:///
        at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:305)
        at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:47)
        at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:357)
        at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:643)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.isContainer(HDataStorage.java:203)
        at org.apache.pig.backend.hadoop.datastorage.HDataStorage.asElement(HDataStorage.java:131)
        at org.apache.pig.tools.grunt.GruntParser.processLS(GruntParser.java:576)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:304)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:168)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:144)
        at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:75)
        at org.apache.pig.Main.main(Main.java:352)

Did I configure Hadoop correctly, or am I going wrong somewhere? Please let me know if there is any file I need to share. I have done a lot of research but could not fix it. By the way, I am a newbie to Hadoop and Pig, so please help. Thanks.

Note: I happened to notice that my ~/.bashrc file did not include PIG_CLASSPATH = /usr/lib/hadoop/conf. Could that be an issue, folks? - Matt
What's your default file system? - Manjunath Ballur

1 Answer


Check your configuration in hadoop-site.xml, core-site.xml, and mapred-site.xml.
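In particular, the default file system is set in core-site.xml. A minimal sketch, assuming your NameNode listens on localhost:54310 as in the URI above (fs.defaultFS is the Hadoop 2.x property name; the older fs.default.name still works too):

<!-- core-site.xml: point the default file system at HDFS (sketch; adjust host/port to your cluster) -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:54310</value>
  </property>
</configuration>

If this property is missing, or Pig cannot see this file on its classpath, Pig falls back to the local file system (file:///), which is exactly the error you are seeing.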

Use PIG_CLASSPATH to specify additional classpath entries. For example, to add the Hadoop configuration files (hadoop-site.xml, core-site.xml) to the classpath:

export PIG_CLASSPATH=<path_to_hadoop_conf_dir>

You should also make these entries take precedence over the default classpath entries by setting PIG_USER_CLASSPATH_FIRST:

export PIG_USER_CLASSPATH_FIRST=true
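Putting both together, a sketch of the relevant ~/.bashrc entries (the conf directory /usr/lib/hadoop/conf is the one Matt mentioned in the comments; substitute the conf directory of your own installation):

# Point Pig at the Hadoop client configuration (core-site.xml etc.)
export PIG_CLASSPATH=/usr/lib/hadoop/conf
# Make these entries take precedence over Pig's bundled defaults
export PIG_USER_CLASSPATH_FIRST=true

Remember to run source ~/.bashrc (or open a new shell) before restarting Pig.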

After that, you should be able to start the Grunt shell.
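To verify, restart Pig and re-run the listing that failed earlier (assuming the same NameNode address from the question). The startup INFO line should now report the hdfs:// URI instead of file:///, and both commands below should list the HDFS root instead of failing with "Wrong FS":

pig
grunt> ls hdfs://localhost:54310/
grunt> ls /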