0
votes

My Query: I have installed Hive on Ubuntu. I tested it earlier and it was working fine, but later it started giving an error.

insert into test2 values (1,'Mahendra');

My Hive Error:

Query ID = mahendra_20180827145546_86973630-5eff-4764-ade8-cfc3a8ce5c37
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapreduce.job.reduces=<number>
Starting Job = job_1535360274908_0006, Tracking URL = http://mahendra-system:8088/proxy/application_1535360274908_0006/
Kill Command = /home/mahendra/HDEcho/hadoop-3.0.3/bin/mapred job -kill job_1535360274908_0006
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2018-08-27 14:55:51,645 Stage-1 map = 0%, reduce = 0%
Ended Job = job_1535360274908_0006 with errors
Error during job, obtaining debugging information...
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1: HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

My YARN Error:

Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster

My Diagnostics:

 Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
Please check whether your etc/hadoop/mapred-site.xml contains the below configuration:
<property>
<name>yarn.app.mapreduce.am.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
<property>
<name>mapreduce.map.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property>
<property>
<name>mapreduce.reduce.env</name>
<value>HADOOP_MAPRED_HOME=${full path of your hadoop distribution directory}</value>
</property> 

My mapred-site.xml:

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.application.classpath</name>
        <value>/home/mahendra/HDEcho/hadoop-3.0.3/share/hadoop/mapreduce/*:/home/mahendra/HDEcho/hadoop-3.0.3/share/hadoop/mapreduce/lib/*</value>
    </property>
    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=/home/mahendra/HDEcho/hadoop-3.0.3</value>
    </property>
    <property>
        <name>mapreduce.map.env</name>
        <value>HADOOP_MAPRED_HOME=/home/mahendra/HDEcho/hadoop-3.0.3</value>
    </property>
    <property>
        <name>mapreduce.reduce.env</name>
        <value>HADOOP_MAPRED_HOME=/home/mahendra/HDEcho/hadoop-3.0.3</value>
    </property>
</configuration>

2 Answers

2
votes
    export YARN_HOME=$HADOOP_HOME

Adding this line to ~/.bashrc worked for me.
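For reference, a minimal ~/.bashrc sketch along those lines, assuming the Hadoop 3.0.3 install path from the question (the path is only an example; point HADOOP_HOME at your own distribution directory):

    # Example only: install path taken from the question, adjust for your machine
    export HADOOP_HOME=/home/mahendra/HDEcho/hadoop-3.0.3
    export HADOOP_MAPRED_HOME=$HADOOP_HOME
    export YARN_HOME=$HADOOP_HOME
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

Run source ~/.bashrc (or open a new shell) and restart YARN so the daemons pick up the new environment.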

1
vote

I faced the same issue (Java 1.8, update 291; Hadoop 2.8.0), which was resolved after setting the YARN application classpath property in yarn-site.xml:

Step 1: Run hadoop classpath. This command prints the list of classpath entries to be passed as the value in yarn-site.xml.
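
For illustration only (the exact output depends on where Hadoop is installed), the command and the kind of value it prints look roughly like this, using the layout from the question and truncated:

    $ hadoop classpath
    /home/mahendra/HDEcho/hadoop-3.0.3/etc/hadoop:/home/mahendra/HDEcho/hadoop-3.0.3/share/hadoop/common/lib/*:/home/mahendra/HDEcho/hadoop-3.0.3/share/hadoop/common/*:...

Copy the single colon-separated line it prints and use it verbatim as the property value in Step 2.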

Step 2: Edit the yarn-site.xml as below:

 <property>
    <name>yarn.application.classpath</name>
    <value>output from step1 </value>
 </property>

Restart YARN before triggering the MapReduce job again.
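
For example, using the standard scripts shipped in Hadoop's sbin directory:

    stop-yarn.sh
    start-yarn.sh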