50 votes

I am new to the Hadoop distributed file system. I have completed a single-node installation of Hadoop on my machine, but when I try to upload data to HDFS it gives a Permission Denied error.

Terminal output for the command:

hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)

hduser@ubuntu:/usr/local/hadoop$ 

After using sudo and adding hduser to the sudoers file:

hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x

hduser@ubuntu:/usr/local/hadoop$ 
do you have access rights to the directory - are you using sudo? – ali haider
Yes, after using sudo: hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x – Vignesh Prajapati
In my case, it was because I was trying to download files to a location in my filesystem where I did not have permissions. – optimist

6 Answers

72 votes

I solved this problem temporarily by disabling the DFS permission check, by adding the property below to conf/hdfs-site.xml:

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
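Note that hdfs-site.xml is only read when the daemons start, so HDFS generally needs to be restarted for this to take effect. A minimal sketch, assuming a standard single-node setup where the start/stop scripts that ship with Hadoop are on your PATH:

# Restart HDFS so the NameNode picks up the new hdfs-site.xml
stop-dfs.sh
start-dfs.sh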
53 votes

I had a similar situation; here is my approach, which is somewhat different:

 HADOOP_USER_NAME=hdfs hdfs dfs -put /root/MyHadoop/file1.txt /

What you actually do is read the local file according to your local permissions, but when placing the file on HDFS you are authenticated as the user hdfs. You can do this with another user ID as well (be careful if a real authentication scheme is configured, but that is usually not the case).

Advantages:

  1. Permissions are kept on HDFS.
  2. You don't need sudo.
  3. You don't actually need a local user named 'hdfs' at all.
  4. You don't need to copy anything or change permissions, because of the previous points.
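Applied to the paths from the question, this would look roughly as follows; HADOOP_USER_NAME=hduser is assumed here because the error message shows the HDFS root is owned by hduser:

# Impersonate the owner of the HDFS destination only for this command
HADOOP_USER_NAME=hduser hadoop fs -put /usr/local/input-data/ /input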
15 votes

You are experiencing two separate problems here:


hduser@ubuntu:/usr/local/hadoop$ hadoop fs -put /usr/local/input-data/ /input
put: /usr/local/input-data (Permission denied)

Here, the user hduser does not have access to the local directory /usr/local/input-data. That is, your local permissions are too restrictive; you should change them.
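You can confirm this by checking the local ownership and mode; this is plain Linux, nothing Hadoop-specific:

# Shows owner, group and permission bits of the local directory
ls -ld /usr/local/input-data/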


hduser@ubuntu:/usr/local/hadoop$ sudo bin/hadoop fs -put /usr/local/input-data/ /inwe
put: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="":hduser:supergroup:rwxr-xr-x

Here, the user root (since you are using sudo) does not have access to the HDFS directory /input. As you can see, hduser:supergroup:rwxr-xr-x means only hduser has write access. Hadoop does not treat root as a special user.
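To see the HDFS-side ownership and permissions for yourself, listing from the root is usually enough; the first columns show the mode, owner and group of each entry:

# Inspect HDFS permissions as hduser (read access is enough for this)
hadoop fs -ls /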


To fix this, I suggest you change the permissions on the local data:

sudo chmod -R og+rx /usr/local/input-data/

Then, try the put command again as hduser.
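That is, with the local permissions opened up, the original command should work without sudo:

# Run as hduser again, no sudo needed this time
hadoop fs -put /usr/local/input-data/ /input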

6 votes

I've solved this problem by using the following steps:

su hdfs
hadoop fs -put /usr/local/input-data/ /input
exit
5 votes

Start a shell as hduser (from root) and run your command:

sudo -u hduser bash
hadoop fs -put /usr/local/input-data/ /input

[update] Also note that the hdfs user is the superuser and has all r/w privileges.
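Because of that, the superuser can also repair ownership on the HDFS side if a directory was created by the wrong account. A sketch, assuming an /input directory and an hdfs superuser account as found in packaged distributions:

# Hand the HDFS directory back to hduser (run once, as the superuser)
sudo -u hdfs hadoop fs -chown -R hduser:supergroup /input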

2 votes

For Hadoop 3.x, if you try to create a file on HDFS while unauthenticated (e.g. as user=dr.who), you will get this error.

This is not recommended for systems that need to be secure; however, if you'd like to disable file permissions entirely in Hadoop 3, the hdfs-site.xml setting has changed to:

<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>

https://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml
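After editing hdfs-site.xml and restarting HDFS, you can check which value is actually in effect; hdfs getconf reads the configuration the client sees:

# Should print "false" once the override is picked up
hdfs getconf -confKey dfs.permissions.enabled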