2 votes

I am using Ubuntu 12.04, hadoop-0.23.5, and hive-0.9.0. I configured my metastore_db in a separate location, $HIVE_HOME/my_db/metastore_db, in hive-site.xml.
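For reference, the relevant property can be checked like this (javax.jdo.option.ConnectionURL is the standard Hive metastore connection property; the grep is just one hedged way to inspect it, and the -A1 context width is a guess at the file's formatting):

    # Confirm the metastore location Hive is configured with
    # (the property value should reference $HIVE_HOME/my_db/metastore_db)
    grep -A1 "javax.jdo.option.ConnectionURL" $HIVE_HOME/conf/hive-site.xml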

Hadoop runs fine; jps shows ResourceManager, NameNode, DataNode, NodeManager, and SecondaryNameNode.

Hive started perfectly: metastore_db and derby.log were created, and all Hive commands ran successfully; I could create databases, tables, etc. But a few days later, when I run show databases or show tables, I get the error below:

FAILED: Error in metadata: MetaException(message:Got exception:  java.net.ConnectException Call to localhost/127.0.0.1:54310 failed on connection exception: java.net.ConnectException: Connection refused) FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
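For what it's worth, the error points at the NameNode RPC address localhost:54310, so a first hedged check is whether anything is actually listening there:

    # Is the NameNode listening on the port from the error message?
    netstat -tln | grep 54310
    # Does HDFS itself respond?
    hadoop fs -ls /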

4 Answers

3 votes

I had this problem too, and since the accepted answer did not help me, I will add my solution here for others:

My problem was that I had a single machine with a pseudo-distributed setup with Hive installed. It worked fine with localhost as the hostname. However, when we decided to add more machines to the cluster, we also decided to give the machines proper names: machine01, machine02, etc.

I changed all the Hadoop conf/*-site.xml files and the hive-site.xml file too, but still got the error. After exhaustive research I realized that Hive was picking up the URIs not from the *-site.xml files but from the metastore tables in MySQL. All the Hive table metadata is saved in two tables, SDS and DBS. After changing the DB_LOCATION_URI column in DBS and the LOCATION column in SDS to point to the latest NameNode URI, I was back in business.
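For illustration, a minimal sketch of that update, assuming the MySQL metastore database is named metastore and the cluster moved from localhost to machine01 (both names are assumptions; substitute your own values, and back up the metastore first):

    # Back up the metastore before editing it directly (assumed DB name: metastore)
    mysqldump metastore > metastore_backup.sql
    # Rewrite the stored NameNode URIs in DBS and SDS
    # (the old/new host:port values here are assumptions -- use your own)
    mysql metastore -e "UPDATE DBS SET DB_LOCATION_URI = REPLACE(DB_LOCATION_URI, 'hdfs://localhost:54310', 'hdfs://machine01:54310');"
    mysql metastore -e "UPDATE SDS SET LOCATION = REPLACE(LOCATION, 'hdfs://localhost:54310', 'hdfs://machine01:54310');"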

Hope this helps others.

0 votes

Possible reasons for this:

  1. If you changed your Hadoop/Hive version, you may be pointing to the previous Hadoop version (which has fs.default.name=hdfs://localhost:54310 in core-site.xml) in your hive-0.9.0/conf/hive-env.sh file (see the check sketched after this list).
  2. $HADOOP_HOME may point to some other location.
  3. The specified version of Hadoop is not working.
  4. Your NameNode may be in safe mode; run bin/hdfs dfsadmin -safemode leave or bin/hadoop dfsadmin -safemode leave.
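To check the first two possibilities, one hedged approach (the conf/ path is the usual 0.23-era layout, an assumption here):

    # Which HDFS URI is this Hadoop installation configured with?
    grep -A1 "fs.default.name" $HADOOP_HOME/conf/core-site.xml
    # Which Hadoop will Hive actually pick up?
    echo $HADOOP_HOME
    $HADOOP_HOME/bin/hadoop version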
0 votes

In the case of a fresh installation, the above problem can be the effect of a NameNode issue.

Try formatting the NameNode using the command below. Be aware that formatting erases all HDFS metadata, so this is only appropriate on a fresh installation with no data to lose:

hadoop namenode -format
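After formatting, a hedged follow-up is to restart HDFS and confirm the NameNode is up before retrying Hive (the sbin/ script locations are the usual 0.23-era layout, an assumption here):

    # Restart HDFS so the freshly formatted NameNode comes up
    $HADOOP_HOME/sbin/stop-dfs.sh
    $HADOOP_HOME/sbin/start-dfs.sh
    # The NameNode should appear in the daemon list
    jps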
0 votes

1. Take your NameNode out of safe mode. Try the command below:

hadoop dfsadmin -safemode leave

2. Restart your Hadoop daemons:

sudo service hadoop-master stop

sudo service hadoop-master start
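To confirm the restart worked, one hedged check (the expected daemon list matches the one in the question):

    # All five daemons from the question should be listed again
    jps
    # Should print: Safe mode is OFF
    hadoop dfsadmin -safemode get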