2
votes

On a RedHat test server I installed Hadoop 2.7 and ran Hive, Pig and Spark without issues. But when I tried to access the Hive metastore from Spark I got errors, so I thought of adding a hive-site.xml. (After extracting the apache-hive-1.2.1-bin.tar.gz file I had only added $HIVE_HOME to .bashrc as per the tutorial, and everything was working except this integration with Spark.) On the Apache site I found that I need to put the metastore configuration in hive-site.xml, so I created the file as below:

<configuration>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby://localhost:1527/metastore_db;create=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
</configuration>

I used localhost as the host since it is a single-node machine. After that I am not able to connect even to Hive. It throws this error:

Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)

.... Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:derby://localhost:1527/metastore_db;create=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ java.sql.SQLException: No suitable driver found for jdbc:derby://localhost:1527/metastore_db;create=true

There are many more error log entries pointing to the same thing. If I remove hive-site.xml from the conf folder, Hive works without issues. Can anyone point me to the right default metastore configuration? Thanks, Anoop R

4
You could try putting an echo $CLASSPATH command in the hive script. But I have tried every suggestion on Stack Overflow etc. and get the same error. I am going to give up and try Spark or HBase or something else. Here is my .bashrc:

export HIVE_HOME=/usr/local/hive/apache-hive-2.1.1-bin
export PATH=$HIVE_HOME/bin:$PATH
export DERBY_INSTALL=/usr/local/derby/db-derby-10.13.1.1-bin
export DERBY_HOME=$DERBY_INSTALL
export PATH=$PATH:$DERBY_HOME/bin
export HIVE_CONF_DIR=$HIVE_HOME/conf
export CLASSPATH=$CLASSPATH:$DERBY_HOME/lib/derby.jar:$DERBY_HOME/lib/derbytools.jar

- Walker Rowe

4 Answers

1
votes

Derby is used as an embedded database here. Try using

jdbc:derby:metastore_db;create=true

as the JDBC URL. See also

https://cwiki.apache.org/confluence/display/Hive/AdminManual+MetastoreAdmin#AdminManualMetastoreAdmin-EmbeddedMetastore

To get a fully functional metastore (one that can be accessed from different services), try setting it up with MySQL as described in the document above; a sketch of such a configuration follows.
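For reference, a hive-site.xml along these lines should work for a MySQL-backed metastore. This is only a minimal sketch: it assumes a local MySQL server with a metastore database and a hiveuser account that you create yourself, plus the MySQL JDBC connector jar (mysql-connector-java) placed in $HIVE_HOME/lib.

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <!-- assumes a MySQL server on localhost and a database named "metastore" -->
    <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <!-- hypothetical account; create it in MySQL and grant it access to the metastore database -->
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>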

1
votes

As you are setting up an embedded metastore database, use the property below for the JDBC URL:

<property>
   <name>javax.jdo.option.ConnectionURL</name>
   <value>jdbc:derby:metastore_db;create=true</value>
   <description>JDBC connect string for a JDBC metastore</description>
</property>
0
votes

I was also facing a similar kind of exception while installing Hive. What worked for me was initializing the Derby db: go to $HIVE_HOME/bin and run the command schematool -initSchema -dbType derby (see the sketch below). You can follow this link: http://www.edureka.co/blog/apache-hive-installation-on-ubuntu
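As a sketch, the initialization and a quick check look like this (the path assumes the $HIVE_HOME export from the question; schematool ships with Hive under bin):

cd $HIVE_HOME/bin
./schematool -initSchema -dbType derby    # creates the metastore tables in the embedded Derby db
./schematool -info -dbType derby          # prints the metastore schema version to confirm it worked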

0
votes

It will work if you put derbyclient.jar in the lib folder of Hive. The "No suitable driver found" error means the Derby client JDBC driver, which that jar provides, is not on Hive's classpath.
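For example (a sketch only; the Derby and Hive install paths are assumptions, adjust them to your layout):

# copy the Derby client JDBC driver onto Hive's classpath
cp $DERBY_HOME/lib/derbyclient.jar $HIVE_HOME/lib/
# the jdbc:derby://localhost:1527/... URL also needs the Derby network server to be running
$DERBY_HOME/bin/startNetworkServer &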