
I have prepared a 2-node cluster with plain Apache Hadoop. These nodes act as Kerberos clients to another machine which acts as the Kerberos server (KDC). The KDC database and the hdfs principals on each machine have been created, along with their keytab files using the proper encryption types (AES). The required hdfs-site.xml, core-site.xml, mapred-site.xml, yarn-site.xml and container-executor.cfg files have been modified. For unlimited strength cryptography, the JCE policy files have also been placed in the $JAVA_HOME/lib/security directory.
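For reference, the principals and keytabs were created roughly along the following lines; the realm, hostname and keytab path shown here are placeholders rather than the actual values used:

# on the KDC: create the hdfs service principal for one node (realm assumed to be EXAMPLE.COM)
kadmin.local -q "addprinc -randkey hdfs/node1.example.com@EXAMPLE.COM"
# export it to a keytab restricted to an AES key
kadmin.local -q "xst -k /etc/security/keytabs/hdfs.keytab -e aes256-cts:normal hdfs/node1.example.com@EXAMPLE.COM"
# on the node: confirm the keytab entries and their encryption types
klist -kte /etc/security/keytabs/hdfs.keytab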

Starting the NameNode daemon works fine, but when accessing HDFS with

hadoop fs -ls /

we get the error below:

15/02/06 15:17:12 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "xxxxxxx/10.122.48.12"; destination host is: "xxxxxxx":8020;

If anyone has prior knowledge of or has worked with Kerberos on top of Hadoop, kindly suggest a solution to the above issue.


1 Answer


To use Hadoop commands, you first need to run kinit to get a Kerberos ticket:

kinit [-kt user_keytab username]

Once it's done, you can list the ticket with:

klist
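
For example, assuming an hdfs service principal and keytab like the ones typically used on a secured cluster (the keytab path, principal and realm below are placeholders, adjust them to your setup):

# obtain a ticket from the service keytab (placeholder path/principal/realm)
kinit -kt /etc/security/keytabs/hdfs.keytab hdfs/$(hostname -f)@EXAMPLE.COM
# the ticket should now show up in the credential cache
klist
# and the HDFS command should succeed
hadoop fs -ls /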

See Cloudera's documentation for more details: Verify that Kerberos Security is Working
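
If the error persists even with a valid ticket, enabling the JVM's Kerberos debug output for the client can help pinpoint where the GSS negotiation fails. This uses the standard JDK property sun.security.krb5.debug, passed through HADOOP_CLIENT_OPTS, which the hadoop client scripts pick up:

HADOOP_CLIENT_OPTS="-Dsun.security.krb5.debug=true" hadoop fs -ls /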