I have root access to the server, which has Hadoop and Hive installed. But despite my root privileges, I cannot copy data from the local file system to HDFS:

root@serv:~# hadoop fs -put flume_test/logs /user
put: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
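
Listing the root of HDFS confirms the owner and mode reported in the error (output abbreviated):

root@serv:~# hadoop fs -ls / | grep user
drwxr-xr-x   - hdfs supergroup          0 ... /user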

I have tried different ways to create a table from the file flume_test/logs, which is not in HDFS. For example:

CREATE TABLE natlog_orc (
    src_local_ip STRING,
    time_start STRING,
    time_end STRING,
    src_global_ip STRING,
    protocol STRING,
    port_start STRING,
    port_end STRING
)
ROW FORMAT DELIMITED
    FIELDS TERMINATED BY ","
STORED AS TEXTFILE
LOCATION '/root/flume_test/logs';

But none of them work.

So, how can I create an external table from a file that isn't located in HDFS? Or which parameter in the Hadoop settings can I change to get the necessary privileges for copying data into HDFS?

1 Answer

HDFS permissions are separate from your local file system's permissions; root has no special privileges on HDFS. You're getting that error because the root user does not have permission to write to /user, which is owned by the hdfs user. Switch to the hdfs user (run su - hdfs), create a directory on HDFS to put your data into (e.g. /user/root), and chown that directory to root. Then you'll be able to upload the data to HDFS, as in the sketch below.
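
A minimal command sequence, assuming hdfs is the HDFS superuser account (the default in a stock installation) and /user/root as the target directory:

root@serv:~# su - hdfs
hdfs@serv:~$ hadoop fs -mkdir /user/root        # create a home directory for root on HDFS
hdfs@serv:~$ hadoop fs -chown root /user/root   # make root the owner so it can write there
hdfs@serv:~$ exit
root@serv:~# hadoop fs -put flume_test/logs /user/root   # the upload now succeeds

Once the data is on HDFS, the CREATE TABLE statement from the question should also work, with LOCATION pointing at the HDFS directory (e.g. '/user/root/logs') instead of a local path.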