
Hey, I am installing Hive on a Hadoop 2.0 multi-node cluster, and I am not able to create a folder using this command:

[hadoop@master ~]$ $HADOOP_HOME/bin/hadoop fs -mkdir /tmp

16/07/19 14:20:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

[hadoop@master ~]$ $HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse

16/07/19 14:24:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Importantly, I am not able to find the created folder. Where is it being created? I am not sure. Please help.

jps shows the Hadoop daemons are running fine:
[hadoop@master ~]$ jps

2977 ResourceManager  
2613 DataNode  
3093 NodeManager  
2822 SecondaryNameNode  
2502 NameNode  
5642 Jps
So $ hadoop fs -ls / doesn't show /tmp on HDFS? – Binary Nerd

Yes, now it shows the listing below, but why "supergroup"?

drwxr-xr-x - hadoop supergroup 0 2016-07-19 13:37 /apple
drwxr-xr-x - hadoop supergroup 0 2016-07-19 13:42 /steve_jobs
drwxrwxr-x - hadoop supergroup 0 2016-07-18 12:51 /tmp
drwxr-xr-x - hadoop supergroup 0 2016-07-19 13:22 /user
drwxr-xr-x - hadoop supergroup 0 2016-07-19 15:50 /usr

– Vinay Ram

OK, so it's been created. It's owned by user hadoop and is in the supergroup group. What's your question? – Binary Nerd

Why am I not able to see this physically? And I want to change the group to hadoop. – Vinay Ram

What do you mean by physically? You access HDFS via the hadoop CLI. You can change the owner and group using something like: $ hadoop fs -chown hadoop:hadoop /tmp – Binary Nerd

1 Answer


The warning you are getting after running the -mkdir command does not impact Hadoop's functionality. It only means the native-hadoop library could not be loaded for your platform, so the built-in Java classes are used instead; you can safely ignore it. See here for details.
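If the warning bothers you, a common way to suppress it (a sketch assuming the default log4j setup; your log4j.properties usually lives under $HADOOP_CONF_DIR) is to raise the log level for that one class:

```properties
# In $HADOOP_CONF_DIR/log4j.properties:
# only show ERROR and above from NativeCodeLoader,
# which hides the "Unable to load native-hadoop library" warning
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```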

About creating directories under the root, i.e. "/": it is a one-time activity and should be done by the superuser. Once the root-level directories like "/tmp" and "/user" exist, you can create user-specific folders like "/user/hduser" and assign ownership of them using commands such as:

sudo -u hdfs hdfs dfs -mkdir /tmp

OR

sudo -u hdfs hdfs dfs -mkdir -p /user/hive/warehouse

Once you have the main folder ready, assign its ownership to the user who will be using it:

sudo -u hdfs hdfs dfs -chown hduser:hadoop /user/hive/warehouse
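Putting the steps above together, a typical one-time setup for the Hive warehouse looks like the following (a sketch, assuming the HDFS superuser is hdfs and the Hive user is hduser; the chmod g+w on /tmp and the warehouse directory is what Hive's setup instructions usually ask for):

```shell
# One-time setup, run by the HDFS superuser
sudo -u hdfs hdfs dfs -mkdir -p /tmp
sudo -u hdfs hdfs dfs -mkdir -p /user/hive/warehouse

# Hive writes scratch data to /tmp and table data to the warehouse,
# so both need group-write permission
sudo -u hdfs hdfs dfs -chmod g+w /tmp
sudo -u hdfs hdfs dfs -chmod g+w /user/hive/warehouse

# Hand the warehouse over to the user/group that will run Hive
sudo -u hdfs hdfs dfs -chown hduser:hadoop /user/hive/warehouse
```

After this, running Hive as hduser should be able to create databases and tables without permission errors.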

If you want to find the files/directories created on HDFS, you have to interact with the HDFS filesystem through CLI commands only, e.g.:

hdfs dfs -ls /
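For example, to pull the owner and group out of each listing entry (a sketch parsing the whitespace-separated columns of hdfs dfs -ls output; the sample line is the /tmp entry from the comment thread above, captured here so the snippet runs without a live cluster):

```shell
# In `hdfs dfs -ls` output, column 3 is the owner, column 4 the group,
# and column 8 the path
line='drwxrwxr-x   - hadoop supergroup          0 2016-07-18 12:51 /tmp'
echo "$line" | awk '{print $8, "owner="$3, "group="$4}'
# prints: /tmp owner=hadoop group=supergroup
```

Against a real cluster you would pipe `hdfs dfs -ls /` into the same awk command instead of the captured line.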

The data you create on HDFS does have a physical location on the local filesystem of each node, but you will not see it there as your files and directories; HDFS stores it as block files. Look for the dfs.namenode.name.dir and dfs.datanode.data.dir properties in hdfs-site.xml under your installation, usually located at "/usr/local/hadoop/etc/hadoop/hdfs-site.xml".
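To see which local directory your DataNode uses, you can extract that property from hdfs-site.xml with standard tools. A sketch using a minimal sample file (the property name is real; the /tmp path and the file:///usr/local/... value are assumptions for illustration):

```shell
# Minimal sample hdfs-site.xml; your real file sits under
# $HADOOP_HOME/etc/hadoop/ and contains more properties
cat > /tmp/sample-hdfs-site.xml <<'EOF'
<configuration>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:///usr/local/hadoop/hdfs/datanode</value>
  </property>
</configuration>
EOF

# Print the <value> on the line following the property name
grep -A1 'dfs.datanode.data.dir' /tmp/sample-hdfs-site.xml \
  | sed -n 's/.*<value>\(.*\)<\/value>.*/\1/p'
# prints: file:///usr/local/hadoop/hdfs/datanode
```

If you then look inside that directory on a DataNode, you will find block files named blk_..., not your original file names, which is why the folders never show up "physically" under those names.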