3
votes

I have Hive running successfully on Hadoop using the default metastore database. Now I want to configure Hive to use MySQL (installed on port 3306) as its metastore.

Steps: 1) Create hive-site.xml and add the following properties (inside the <configuration> root element):

  <property>
    <name>hive.metastore.local</name>
    <value>true</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hadoop</value>
  </property>

  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hadoop</value>
  </property>

2) Copied the MySQL JDBC connector JAR into Hive's lib folder.

3) Then checked in MySQL, but no database is created in the Hive warehouse.
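One way to verify the steps above is to check whether Hive actually created its metastore tables in MySQL. This is a sketch: the connector JAR name, $HIVE_HOME, and the hadoop/hadoop credentials are assumptions taken from the configuration in the question.

```shell
# Step 2: copy the MySQL JDBC connector into Hive's lib directory
# (the exact JAR name depends on the Connector/J version you downloaded)
cp mysql-connector-java-5.1.*.jar $HIVE_HOME/lib/

# Step 3: after running a Hive statement such as CREATE TABLE,
# check whether Hive created its metastore tables in the 'hive' database
mysql -u hadoop -phadoop -e 'USE hive; SHOW TABLES;'
# If the metastore is wired up, you should see tables such as DBS, TBLS, COLUMNS
```

If SHOW TABLES comes back empty, Hive is most likely still using the embedded Derby metastore, which is worth checking in the logs.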

I get no error, but the database is still not created. Please suggest a solution if you have one.

edit

Errors in the log file:

ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
1
What's there in your log file? Is it still using Derby? – Tariq
The hive.log file (it should be inside /tmp, if you haven't changed it). – Tariq
I don't understand. Please explain which /tmp folder and which log files. – ruchi
Hive has a log file called hive.log. All the logs get stored in this file right from the time you start Hive. Its default location is the /tmp/your_user_name/ directory. If an error occurs, it gets logged into this file, and if you look at it you'll get some idea about the cause of the error. This /tmp directory is under the root directory (/) of your machine, in your local filesystem. – Tariq
Yes, you are right, I have some errors in the log files; I have also edited the question, please see. – ruchi
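Following the comment above, the log can be inspected directly, assuming the default location of /tmp/your_user_name/hive.log has not been changed:

```shell
# Show the most recent entries in Hive's log for the current user
tail -n 50 /tmp/$USER/hive.log
```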

1 Answer

0
votes

Hive has a limitation:

The database 'hive' that you reference in the following connection URL is used only to store Hive's metadata in the MySQL DBMS:

 1. jdbc:mysql://localhost:3306/hive?createDatabaseIfNotExist=true

This does not mean Hive will use the same name for the data warehouse in HDFS. The limitation is that Hive supports only 'default' as the database name.

Compare this with the JDBC connection string for Hive itself (this should make things a little clearer for you):

2. jdbc:hive://localhost:10000/default

Note the 10000/default part, and compare (1) with (2).
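To make the distinction concrete: the MySQL database from string (1) holds only metadata tables, while the actual table data lives under Hive's warehouse directory in HDFS, which string (2) addresses through HiveServer. A rough way to inspect both layers, where the credentials and warehouse path are assumptions based on the question and the common default:

```shell
# Metadata layer: the 'hive' database in MySQL from connection string (1)
mysql -u hadoop -phadoop -e 'USE hive; SELECT NAME, DB_LOCATION_URI FROM DBS;'

# Data layer: the warehouse directory in HDFS, regardless of the
# metastore database name; this is what connection string (2) serves
hadoop fs -ls /user/hive/warehouse
```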

I'm not sure if this has changed as of yet.