I have the Hortonworks Sandbox running in a VM. I have done all the hive-site.xml configuration and placed the file in Spark's conf directory.
I can access HBase using PySpark and create/update tables, but when I do the same implementation in Scala it gives me the following error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:file:/user/hive/warehouse/src is not a directory or unable to create one)
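For context, the Scala side is essentially the standard HiveContext example (a simplified sketch of what I run; the table name src and its schema here are just illustrative):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object HiveTableTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HiveTableTest")
    val sc = new SparkContext(conf)

    // HiveContext should pick up hive-site.xml from Spark's conf directory
    val hiveContext = new HiveContext(sc)

    // This DDL statement is what fails with the MetaException above
    hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
    hiveContext.sql("SHOW TABLES").collect().foreach(println)

    sc.stop()
  }
}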
I have changed the permissions on the /user/hive/warehouse folder too, but it still gives me the same error.
[root@sandbox ~]# sudo -u hdfs hadoop fs -ls -d /user/hive/warehouse
drwxrwxrwt - hdfs hdfs 0 2015-02-02 09:19 /user/hive/warehouse
My hive-site.xml contains the following properties:
<property>
  <name>hive.security.authorization.enabled</name>
  <value>false</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>
Thank you very much in advance.