
I am trying to run a map-reduce job whose code is kept inside the Hadoop file system (HDFS).
The conventional way of running it is:
hadoop jar [path-to-jar-file-in-local] [main-class] [args]...
[path-to-jar-file-in-local] - this expects a path on the local filesystem, but my jar file is in HDFS.

Any specific reason to keep the jar file in HDFS? You could copyToLocal the jar file and then run it the conventional way. - SurjanSRawat
Edge node access has been disabled by the client; we only have access to the Hadoop environment. No Unix system permissions have been given. - Arun Veeramani
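For completeness, the copyToLocal route suggested in the first comment would look roughly like this on a machine that does have local shell access (the paths and main class below are placeholders):

hdfs dfs -copyToLocal /user/arun/myjob.jar /tmp/myjob.jar   # copy the jar out of HDFS onto local disk
hadoop jar /tmp/myjob.jar com.example.MyJob /input/path /output/path   # run it the conventional way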

1 Answer


See this similar question. All hadoop commands are invoked by the bin/hadoop script. For executing a jar file it uses the RunJar class, which contains the following code snippet:

int firstArg = 0;
String fileName = args[firstArg++];
File file = new File(fileName);

where fileName points to the jar file. Since java.io.File only resolves paths on the local filesystem, hadoop jar cannot load the jar directly from HDFS; the jar has to exist on the local disk of the machine where the command is run.
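A quick way to see the distinction, using a placeholder path such as /user/arun/myjob.jar: an HDFS-aware command finds the jar, while a plain local lookup (which is all java.io.File can do) does not.

hdfs dfs -ls /user/arun/myjob.jar   # listed: the jar exists in HDFS
ls /user/arun/myjob.jar             # No such file or directory: nothing at that path on local disk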