I'm new to Hive and trying to set it up relatively securely for a test environment. I want to use a remote metastore so MapReduce jobs can access the DB. I almost have things working, but when a user with a valid credential tries to create a database I get:
hive> show databases;
OK
default
hive> create database testdb;
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException User: hdfs/[email protected] is not allowed to impersonate [email protected])
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
I can run 'show databases' fine. I have "hive --service metastore" running as hdfs, with hdfs/[email protected] as the principal. I'm running Hive as "myuserid" on the same box. I don't know whether it's related, but if I try to run Hive from another system I get a GSS initiate error unless I use the same principal (hdfs/[email protected]) for hive.metastore.kerberos.principal. Is that expected?
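(For what it's worth, the Hive docs describe a _HOST placeholder in the metastore principal that clients substitute with the metastore's FQDN, which is supposed to avoid hard-coding one host's principal everywhere. The principal and realm below are placeholders, not my actual values:)

```xml
<property>
  <name>hive.metastore.kerberos.principal</name>
  <!-- _HOST is replaced at runtime with the metastore server's FQDN -->
  <value>hive/_HOST@YOUR-REALM</value>
</property>
```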
When I try googling this I see similar issues, but the message about not being able to impersonate shows only the single-part (short) user name, whereas for me it includes the realm. I tried playing with the hadoop.security.auth_to_local property, but it didn't help. MapReduce and HDFS operations are working fine.
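(For reference, the kind of auth_to_local rule I experimented with looked roughly like the following; the realm SUB.DOM.COM is a placeholder inferred from the hostname below, and the rules are a sketch for stripping the realm from one- and two-component principals:)

```xml
<property>
  <name>hadoop.security.auth_to_local</name>
  <value>
    <!-- Map user@REALM and service/host@REALM to their short names -->
    RULE:[1:$1@$0](.*@SUB\.DOM\.COM)s/@.*//
    RULE:[2:$1@$0](.*@SUB\.DOM\.COM)s/@.*//
    DEFAULT
  </value>
</property>
```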
In core-site.xml I have:
<property>
  <name>hadoop.proxyuser.hdfs.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hdfs.groups</name>
  <value>*</value>
</property>
In hive-site.xml I have:
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost/metastore</value>
  <description>the URL of the MySQL database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>password</value>
</property>
<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>
<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.uris</name>
  <value>thrift://hadoopserver.sub.dom.com:9083</value>
</property>
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.sasl.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.metastore.kerberos.keytab.file</name>
  <value>/etc/hadoop/hdfs.keytab</value>
</property>
<property>
  <name>hive.metastore.kerberos.principal</name>
  <value>hdfs/[email protected]</value>
</property>
<property>
  <name>hive.metastore.execute.setugi</name>
  <value>true</value>
</property>
Any ideas?