I have root access to a server with Hadoop and Hive installed. But despite my root privileges, I cannot copy data from the local file system to HDFS:
root@serv:~# hadoop fs -put flume_test/logs /user
put: Permission denied: user=root, access=WRITE, inode="/user":hdfs:supergroup:drwxr-xr-x
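As I read the error, /user is owned by hdfs:supergroup with mode drwxr-xr-x, so root has no write access there. I suspect the copy would have to be done as the hdfs superuser, roughly like this (untested; assumes I can sudo to the hdfs account):

```shell
# Act as the HDFS superuser to create a home directory for root,
# then hand ownership to root so later puts work without sudo.
sudo -u hdfs hadoop fs -mkdir -p /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
hadoop fs -put flume_test/logs /user/root
```

Is that the intended way, or is there a cleaner setting to change?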
I tried different ways to create a table from the file flume_test/logs, which is not in HDFS. For example:
CREATE TABLE natlog_orc (
    src_local_ip STRING,
    time_start STRING,
    time_end STRING,
    src_global_ip STRING,
    protocol STRING,
    port_start STRING,
    port_end STRING
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ","
STORED AS TEXTFILE
LOCATION '/root/flume_test/logs';
But none of them worked.
So, how can I create an external table from a file that is not located in HDFS? Or which Hadoop setting can I change to get the privileges necessary for copying data into HDFS?
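For completeness, I also wondered whether Hive's LOAD DATA with the LOCAL keyword could sidestep the manual HDFS copy entirely (untested sketch; the table name matches the DDL above, and the table would be created without the LOCATION clause first):

```sql
-- With LOCAL, Hive reads the file from the local file system of the
-- machine running the client and copies it into the table's
-- warehouse directory in HDFS itself.
LOAD DATA LOCAL INPATH '/root/flume_test/logs'
INTO TABLE natlog_orc;
```

Would that avoid the permission problem, given that the Hive service user (rather than root) performs the write into HDFS?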