
I am trying to get a Spark/Shark cluster up but keep running into the same problem. I followed the instructions at https://github.com/amplab/shark/wiki/Running-Shark-on-a-Cluster and set up Hive as stated there.

Here are the details; any help would be great.

I have already installed the following packages:

- Spark/Shark 1.0.0
- Apache Hadoop 2.4.0
- Apache Hive 0.13
- Scala 2.9.3
- Java 7

I configured ~/spark/conf/spark-env.sh as follows:

    export HADOOP_HOME=/path/to/hadoop/
    export HIVE_HOME=/path/to/hive/
    export MASTER=spark://xxx.xxx.xxx.xxx:7077
    export SPARK_HOME=/path/to/spark
    export SPARK_MEM=4g
    export HIVE_CONF_DIR=/path/to/hive/conf/
    source $SPARK_HOME/conf/spark-env.sh
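
For reference, a quick way to confirm those values are picked up is to source the file and spot-check them (a minimal sketch; it uses nothing beyond the variables defined above):

    # Load the configuration and spot-check the paths it points at; hive-site.xml
    # in particular, since the metastore connection settings live there.
    source ~/spark/conf/spark-env.sh
    echo "$HIVE_CONF_DIR"
    ls "$HIVE_CONF_DIR/hive-site.xml"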

When I start Spark with "./spark-withinfo", I get the following errors:

 -hiveconf hive.root.logger=INFO,console

    Starting the Shark Command Line Client

    14/07/07 16:26:57 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead

    14/07/07 16:26:57 [main]: WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect.  Use hive.hmshandler.retry.* instead

    Logging initialized using configuration in jar:file:/path/to/hive/lib/hive-exec-0.13.0.jar!/hive-log4j.properties

    14/07/07 16:26:57 [main]: INFO SessionState:

    Logging initialized using configuration in jar:file:/path/to/hive/lib/hive-exec-0.13.0.jar!/hive-log4j.properties

    14/07/07 16:26:57 [main]: INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore

    Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:344)

            at shark.SharkCliDriver$.main(SharkCliDriver.scala:128)

            at shark.SharkCliDriver.main(SharkCliDriver.scala)

    Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

            at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1139)

            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)

            at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)

            at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2444)

            at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2456)

            at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:338)

            ... 2 more

    Caused by: java.lang.reflect.InvocationTargetException

            at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

            at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

            at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

            at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

            at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1137)

            ... 7 more

    Caused by: java.lang.NoSuchFieldError: METASTOREINTERVAL

            at org.apache.hadoop.hive.metastore.RetryingRawStore.init(RetryingRawStore.java:78)

            at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:60)

            at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:71)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:413)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:401)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:439)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:325)

            at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:285)

            at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)

            at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)

            at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4102)

            at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)

            ... 12 more

I guess Spark cannot find some libraries needed to connect to the Hive metastore, but I have been stuck here for a couple of days and don't know how to solve it. BTW, I use MySQL for the Hive metadata, and everything works well in Hive.
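
(For reference, the Hive side can be checked on its own from the command line. This is only a sketch: it confirms that the Hive CLI reaches the MySQL-backed metastore, not that Shark picks up the same configuration.)

    # If this lists databases, Hive itself can reach the metastore;
    # Shark still needs to see the same hive-site.xml.
    $HIVE_HOME/bin/hive -e "SHOW DATABASES;"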

Any help is appreciated. Thanks in advance.

Have you added the MySQL metastore details in hive-site.xml? – visakh
Yes, the metastore works well in Hive. The problem happens when I start Shark. – eigen
Have you added the Hive configuration details in Shark? I guess you will have to add a hive-site.xml in the Shark lib directory. – visakh
I added export HIVE_CONF_DIR=~/path/hive/conf/ in the shark-env.sh file. Is that what you mean? – eigen
No. I would suggest copying hive-site.xml to shark/conf. – visakh
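
That last suggestion would amount to something like the following (a minimal sketch; both paths are placeholders for the actual Hive and Shark installation directories):

    # Copy the Hive configuration into Shark's conf directory so Shark
    # sees the same metastore settings (paths are assumptions).
    cp /path/to/hive/conf/hive-site.xml /path/to/shark/conf/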

1 Answer


You may need to add the MySQL connector JAR file before you start Spark. In my case, I added the MySQL connector JAR like below:

    # in $SPARK_HOME/bin/compute-classpath.sh
    CLASSPATH=$CLASSPATH:/opt/big/hive/lib/mysql-connector-java-5.1.25-bin.jar
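
One way to double-check that change before restarting (a rough sketch; the JAR path and version come from the line above and may differ on your system):

    # Confirm the JAR exists and contains the JDBC driver class,
    # then confirm compute-classpath.sh now includes it in its output.
    ls /opt/big/hive/lib/mysql-connector-java-5.1.25-bin.jar
    unzip -l /opt/big/hive/lib/mysql-connector-java-5.1.25-bin.jar | grep com/mysql/jdbc/Driver.class
    $SPARK_HOME/bin/compute-classpath.sh | grep mysql-connector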