1
votes

I created a database in Hive using the command CREATE DATABASE FIRST_DB; and the database was created.

Then I created a few tables in it and everything was working fine. A directory named FIRST_DB.db was created in my HDFS warehouse location. Then I quit my Hive shell.

The next day, when I started Hive and tried to connect using the command USE FIRST_DB; it gave an error:

SemanticException [Error 10072]: Database does not exist: FIRST_DB

But when I checked HDFS, the FIRST_DB.db directory is still present, and the tables under it are also present. Please help me make this database persist even after I quit the Hive session. Let me know if there is any configuration I missed.


4 Answers

2
votes

Check your hive.metastore.uris variable. Your Hive session may be using the embedded Derby metastore, which is created by default in the local working directory, instead of a shared metastore.

It looks like the problem is with your configuration (the Hive metastore). Check whether it is set up as an embedded or a remote metastore. Embedded is the default, so try changing it to a remote metastore.
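As a minimal sketch, a remote-metastore setup in hive-site.xml might look like the following (the host names, port, and JDBC URL below are placeholders, not values from the question):

```xml
<configuration>
  <!-- Point Hive clients at a shared metastore service instead of the
       embedded Derby database (hypothetical host and port). -->
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://metastore-host.example.com:9083</value>
  </property>
  <!-- On the metastore service itself, back the metastore with a shared
       RDBMS (example MySQL URL) rather than per-directory Derby files. -->
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://db-host.example.com:3306/metastore</value>
  </property>
</configuration>
```

With the default embedded Derby setup, Hive creates a metastore_db directory in whatever directory you launch the shell from, so starting Hive from a different directory the next day would explain the "database does not exist" error even though the data files are still in HDFS.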

1
votes

Make sure that hive-site.xml is present in the Spark config directory. I checked using the Hive command line and everything worked fine, but the same queries were failing from the sqlContext in spark-shell. It turned out there was no hive-site.xml in the Spark config folder.

I had to create a symbolic link; note that the folder locations may be different in your case.

sudo ln -s /etc/hive/conf.dist/hive-site.xml  /etc/spark/conf.dist/hive-site.xml

This link was helpful for me: https://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/how-to-access-the-hive-tables-from-spark-shell/td-p/36609

-1
votes

@stacey, thanks for your help. I solved this problem by creating a new database with the same name, i.e. FIRST_DB. Immediately after that, the old tables were not displayed by the SHOW TABLES command, so I restarted Hive, Hadoop, and my Ubuntu machine. After the restart, it was able to connect to my old database and see the old tables' data.

-1
votes

This solved the issue:

sudo ln -s /etc/hive/conf.dist/hive-site.xml  /etc/spark/conf.dist/hive-site.xml

I exited spark-shell and relaunched it, and I was then able to see the Hive databases from spark-shell.