
I configured Hive with MySQL as my metastore. I can enter the Hive shell and create tables successfully.

Spark version: 2.4.0

Hive version: 3.1.1

When I try to run a Spark SQL program using spark-submit, I get the error below.

2019-03-02 15:43:41 WARN  HiveMetaStore:622 - Retrying creating default database after error: Error creating transactional connection factory
javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
......
......
Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
......
......
org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "HikariCP" plugin to create a ConnectionPool gave an error : The connection pool plugin of type "HikariCP" was not found in the CLASSPATH!

Can anyone help me with this?

1 Answer


I don't know if you have already solved this problem, but here is my advice.

The default connection pool type in your hive-site.xml is HikariCP. Search hive-site.xml for the property datanucleus.connectionPoolingType; its value will be HikariCP. Change it to dbcp. The error says the HikariCP plugin was not found on the classpath, and DBCP is a pool that DataNucleus can use instead, so this avoids the missing-plugin failure while keeping MySQL as your metastore.
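As a sketch, the edited property in hive-site.xml would look like this (property name and the dbcp value are taken from the advice above; verify the exact value accepted by your Hive/DataNucleus version):

```xml
<!-- hive-site.xml: switch the DataNucleus connection pool from HikariCP to DBCP -->
<property>
  <name>datanucleus.connectionPoolingType</name>
  <!-- was HikariCP; dbcp selects the DBCP pool instead -->
  <value>dbcp</value>
</property>
```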

And finally, don't forget to add the mysql-connector-java-5.x.x.jar to Spark's jars directory, e.g. /home/hadoop/spark-2.3.0-bin-hadoop2.7/jars.
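Two ways to do that last step (paths and the jar filename are illustrative; substitute your actual Spark install path and connector version):

```shell
# Option 1: copy the MySQL JDBC driver into Spark's jars directory,
# so it is on the classpath for every spark-submit invocation.
cp mysql-connector-java-5.x.x.jar /home/hadoop/spark-2.3.0-bin-hadoop2.7/jars/

# Option 2: pass the driver per job with spark-submit's --jars flag.
spark-submit --jars /path/to/mysql-connector-java-5.x.x.jar your_app.py
```

Option 1 affects every Spark job on the machine; Option 2 keeps the driver scoped to a single submission.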