
I'm getting the following error when I execute a select count(*) from tablename query while connected through Beeline.

ERROR : Job Submission failed with exception 'org.apache.hadoop.security.AccessControlException(Permission denied
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkOwner(FSPermissionChecker.java:201)

I can execute show tables; successfully, but I get this error any time I run a query against a table. I am logged in as the hadoop user, which has access to both Hadoop and Hive.

I've granted full permissions to the folder where the table resides:

drwxr-xr-x   - hadoop supergroup          0 2015-06-03 15:44 /data1
drwxrwxrwx   - hadoop hadoop              0 2015-06-05 15:23 /tmp
drwxrwxrwx   - hadoop supergroup          0 2015-06-05 15:24 /user

The table is in the user directory.

Environment details:
OS: CentOS
Hadoop: HW 2.6.0
Hive: 1.2

Any help would be greatly appreciated.

Post the result for hdfs dfs -lsr / in your question. - Rajesh N

1 Answer


Is this a Hive managed table? If so, could you post what you get when you run:

hadoop fs -ls /user
hadoop fs -ls /user/hive
hadoop fs -ls /user/hive/warehouse

The error suggests that you are accessing a table as a user who is not its owner, and that this user does not have read and execute access to the table's directory.
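HDFS follows the POSIX permission model, so the usual fix is to open up (or re-own) the table's directory. Below is a minimal local sketch of that model; the warehouse path and user/group names in the comments are assumptions for illustration, not taken from the question:

```shell
# HDFS permissions mirror POSIX: another user needs r (to list) and
# x (to traverse) on every directory down to the table's files.
demo=$(mktemp -d)
mkdir "$demo/tbl"

chmod 700 "$demo/tbl"            # owner-only: other users are denied
stat -c '%a' "$demo/tbl"         # prints 700

chmod 755 "$demo/tbl"            # group/other can now read and traverse
stat -c '%a' "$demo/tbl"         # prints 755

# The HDFS equivalents (hypothetical table path) would be:
#   hdfs dfs -chown -R hadoop:hadoop /user/hive/warehouse/tablename
#   hdfs dfs -chmod -R 755 /user/hive/warehouse/tablename
rm -rf "$demo"
```

Re-owning with -chown is the cleaner fix for a managed table, since Hive expects the querying user to own the warehouse files; -chmod -R 755 is the blunter workaround.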