
Here is my workflow.xml

<action name="hive-node">
    <hive xmlns="uri:oozie:hive-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <prepare>
            <delete path="${nameNode}/user/${wf:user()}/${wfeRoot}/output-data/hive"/>
            <mkdir path="${nameNode}/user/${wf:user()}/${wfeRoot}/output-data"/>
        </prepare>
        <job-xml>hive-site.xml</job-xml>
        <configuration>
            <property>
                <name>mapred.job.queue.name</name>
                <value>${queueName}</value>
            </property>
            <property>
                <name>oozie.log.hive.level</name>
                <value>DEBUG</value>
            </property>
            <property>
                <name>oozie.hive.defaults</name>
                <value>hive-default.xml</value>
            </property>
        </configuration>
        <script>script.q</script>
    </hive>
    <ok to="end"/>
    <error to="fail"/>
</action>

<kill name="fail">
    <message>Hive failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
</kill>
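As an aside, a workflow definition like this can be sanity-checked before submission with the Oozie CLI; a minimal sketch, assuming the `oozie` client is on the PATH and the file is in the current directory:

```shell
# Validate the workflow XML against the Oozie workflow schema
# (catches unclosed tags and unknown elements before you submit)
oozie validate workflow.xml
```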

My job.properties file:

nameNode=hdfs://localhost:8020
jobTracker=localhost:8021
queueName=default
wfeRoot=wfe

oozie.use.system.libpath=true
oozie.libpath=/user/oozie/share/lib/hive

oozie.wf.application.path=${nameNode}/user/${user.name}/${wfeRoot}/hiveoozie
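With a properties file like the one above, the job is submitted and started in one step; a sketch, assuming a default Oozie server URL (the URL is an assumption and may differ on your cluster):

```shell
# Submit and start the workflow; prints the new job id on success
oozie job -oozie http://localhost:11000/oozie -config job.properties -run
```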

Script:

create table brundesh(name string, lname string) row format delimited fields terminated by ',';
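Before debugging the Oozie side, it can help to confirm the script itself is valid by running it directly with the Hive CLI, outside Oozie; a sketch, assuming `hive` is on the PATH of a node with cluster access:

```shell
# Run the script standalone to rule out plain HiveQL errors
hive -f script.q
```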

I copied hive-site.xml, script.hql, and hive-default.xml into the Oozie app directory. I am using CDH3.

Error details:
Error code: JA018
Error message: Main class [org.apache.oozie.action.hadoop.HiveMain], exit code [9]

I copied the required jar files to the sharelib directory in HDFS; that is, I copied all the jar files present in oozie.sharelib.tar.gz from $OOZIE_HOME.

I googled for the error but had no luck. Please help me figure out where I am going wrong.

This error is not the real error - it simply says that something went wrong in Hive. You should look in Hive logs for the real error. Afraid I can't help much without that. As an aside, are you sure you want to delete and create a folder when the Hive action is just creating a table? The two things seem completely unrelated. Maybe look at using a separate FS action. - Ben Watson

1 Answer


As mentioned by Ben, please check the Hive log, which is present on the respective node, or check the console URL for the log details.
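The underlying error can usually be pulled through the Oozie CLI as well; a sketch, where the server URL and the job id are placeholders you would replace with your own:

```shell
# Show the status of each action in the workflow (job id is a placeholder)
oozie job -oozie http://localhost:11000/oozie -info 0000001-oozie-oozi-W

# Fetch the job log, which includes the launcher output for the Hive action
oozie job -oozie http://localhost:11000/oozie -log 0000001-oozie-oozi-W
```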

I would also suggest the following steps:

  1. Take a backup of the shared lib jars from the HDFS location.

  2. Upload the same jars from the local Hive lib location to the HDFS shared location as the Oozie user.

  3. Make sure there are no duplicate Hive jars in any local location other than the Hive lib path.

  4. All nodes should have the same jars.

  5. If you are using Pig as well, perform steps 1, 2, and 3 for Pig too.

  6. Check the Hadoop classpath to confirm it has been set properly.
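Steps 1 and 2 above can be sketched as shell commands; the local Hive lib path and backup directory below are assumptions and will vary by install:

```shell
# Step 1: back up the existing sharelib jars from HDFS to a local directory
hadoop fs -copyToLocal /user/oozie/share/lib/hive /tmp/sharelib-hive-backup

# Step 2: re-upload the jars from the local Hive lib as the oozie user
# (/usr/lib/hive/lib is a typical CDH path, but confirm it on your nodes)
sudo -u oozie hadoop fs -put /usr/lib/hive/lib/*.jar /user/oozie/share/lib/hive/
```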