
I am running my MapReduce job as a Java action from an Oozie workflow. When I run the job directly on my Hadoop cluster it runs successfully, but when I run the same jar from the Oozie workflow it throws an exception.

This is my workflow.xml:

<workflow-app name="HBaseToFileDriver" xmlns="uri:oozie:workflow:0.1">

    <start to="mapReduceAction"/>
        <action name="mapReduceAction">
                <java>
                         <job-tracker>${jobTracker}</job-tracker>
                        <name-node>${nameNode}</name-node>
                        <prepare>
                                <delete path="${outputDir}"/>
                        </prepare>

                        <configuration>
                                <property>
                                        <name>mapred.mapper.new-api</name>
                                        <value>true</value>
                                </property>
                                <property>
                                        <name>mapred.reducer.new-api</name>
                                        <value>true</value>
                                </property>
                                 <property>
                                        <name>oozie.libpath</name>
                                        <value>${appPath}/lib</value>
                                </property>
                                <property>
                                    <name>mapreduce.job.queuename</name>
                                    <value>root.fricadev</value>
                                </property>

                            </configuration>
                        <main-class>com.thomsonretuers.hbase.HBaseToFileDriver</main-class>
                        <arg>fricadev:FinancialLineItem</arg>
                        <capture-output/>
                </java>
                <ok to="end"/>
                <error to="killJob"/>
        </action>
        <kill name="killJob">
            <message>"Killed job due to error: ${wf:errorMessage(wf:lastErrorNode())}"</message>
        </kill>
    <end name="end" />
</workflow-app>

Below is the exception I see in the YARN logs. Even though the job shows as succeeded, the output files are not generated.

Did you try checking stackoverflow.com/questions/33829017/…? How did you generate the keytab file, using ktutil? - Deepan Ram
@SUDARSHAN Where do you get this exception? Is it part of the Java action's output? Can you post more of the log? - Alex
@DeepanRam Yes, using ktutil. But I don't know where to put the generated keytab file in the Oozie workflow directory. - SUDARSHAN

1 Answer


Have you looked into the Oozie Java Action documentation?

IMPORTANT: In order for a Java action to succeed on a secure cluster, it must propagate the Hadoop delegation token like in the following code snippet (this is benign on non-secure clusters):

// propagate delegation related props from launcher job to MR job
if (System.getenv("HADOOP_TOKEN_FILE_LOCATION") != null) {
    jobConf.set("mapreduce.job.credentials.binary", System.getenv("HADOOP_TOKEN_FILE_LOCATION"));
}

You must read HADOOP_TOKEN_FILE_LOCATION from the environment and set it on the mapreduce.job.credentials.binary property of your job configuration.

HADOOP_TOKEN_FILE_LOCATION is set by Oozie at runtime.
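To make the snippet above testable outside a cluster, the lookup can be factored into a small helper that takes the environment as a plain map. This is only a sketch: the class name `TokenPropagation` and the helper `tokenFileLocation` are illustrative, not part of any Hadoop or Oozie API; only the `HADOOP_TOKEN_FILE_LOCATION` variable and the `mapreduce.job.credentials.binary` property come from the answer above.

```java
import java.util.Map;
import java.util.Optional;

// Sketch of the delegation-token propagation step from an Oozie Java action.
// The env lookup is separated from the side effect so it can be unit-tested.
public class TokenPropagation {

    static final String ENV_KEY = "HADOOP_TOKEN_FILE_LOCATION";
    static final String CONF_KEY = "mapreduce.job.credentials.binary";

    // Returns the token file location if the Oozie launcher exported it;
    // empty on a non-secure cluster or outside Oozie.
    static Optional<String> tokenFileLocation(Map<String, String> env) {
        return Optional.ofNullable(env.get(ENV_KEY));
    }

    public static void main(String[] args) {
        // In a real driver, before submitting the job, you would do:
        //   tokenFileLocation(System.getenv())
        //       .ifPresent(loc -> jobConf.set(CONF_KEY, loc));
        // Here we just print what would be set, if anything.
        tokenFileLocation(System.getenv())
                .ifPresent(loc -> System.out.println(CONF_KEY + " = " + loc));
    }
}
```

Keeping the check conditional (via `Optional`/null check) is what makes the code benign on non-secure clusters, where the variable is simply absent.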