
I am using Cloudera Hadoop and just added a node to the existing cluster, but I am not able to start the HDFS role on the node. The following is the exception I got:

FATAL   org.apache.hadoop.hdfs.server.datanode.DataNode 
Exception in secureMain

java.lang.InternalError
at sun.security.ec.SunEC.initialize(Native Method)
at sun.security.ec.SunEC.access$000(SunEC.java:49)
at sun.security.ec.SunEC$1.run(SunEC.java:61)
at sun.security.ec.SunEC$1.run(SunEC.java:58)
at java.security.AccessController.doPrivileged(Native Method)
at sun.security.ec.SunEC.<clinit>(SunEC.java:58)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:383)
at sun.security.jca.ProviderConfig$2.run(ProviderConfig.java:221)
at sun.security.jca.ProviderConfig$2.run(ProviderConfig.java:206)
at java.security.AccessController.doPrivileged(Native Method)
at sun.security.jca.ProviderConfig.doLoadProvider(ProviderConfig.java:206)
at sun.security.jca.ProviderConfig.getProvider(ProviderConfig.java:187)
at sun.security.jca.ProviderList.getProvider(ProviderList.java:233)
at sun.security.jca.ProviderList$ServiceList.tryGet(ProviderList.java:434)
at sun.security.jca.ProviderList$ServiceList.access$200(ProviderList.java:376)
at sun.security.jca.ProviderList$ServiceList$1.hasNext(ProviderList.java:486)
at javax.crypto.KeyGenerator.nextSpi(KeyGenerator.java:339)
at javax.crypto.KeyGenerator.<init>(KeyGenerator.java:169)
at javax.crypto.KeyGenerator.getInstance(KeyGenerator.java:224)
at org.apache.hadoop.security.token.SecretManager.<init>(SecretManager.java:143)
at org.apache.hadoop.hdfs.security.token.block.BlockPoolTokenSecretManager.<init>(BlockPoolTokenSecretManager.java:36)
at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1076)
at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:411)
at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2301)
at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2188)
at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2235)
at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2411)
at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2435)

Please help if you know the solution.

sudo chmod 755 /path/to/new/Datanode/ - try giving all permissions to the new DataNode dir. – Ronak Patel
It's already there! – Sujit Rai

1 Answer


I just downgraded the OpenJDK version and it worked!
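
For reference, a minimal sketch of what that JDK switch can look like on a RHEL/CentOS host (a typical base for CDH clusters). The package name, JDK path, and the /etc/default/bigtop-utils override are illustrative assumptions; adjust them to whatever OpenJDK build is known to work in your environment:

    # Check which JDK the DataNode host is currently running
    java -version

    # Install an older OpenJDK build (example package name for RHEL/CentOS)
    sudo yum install -y java-1.7.0-openjdk-devel

    # Switch the system "java" alternative to the newly installed JDK
    sudo alternatives --config java

    # Let the CDH daemons pick up the same JDK, e.g. via bigtop-utils
    echo 'export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk' | sudo tee -a /etc/default/bigtop-utils

    # Then restart the HDFS DataNode role for this host from Cloudera Manager

If you manage Java through Cloudera Manager instead, pointing the host's Java Home Directory setting at the downgraded JDK and restarting the role should achieve the same result.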