I am new to Hadoop and have installed Hadoop 3.1.2 on Ubuntu 16.04 in standalone mode. When I try to start the daemons with start-all.sh, the command reports that it is starting each daemon. However, when I then check with jps, nothing is running except jps itself:
(sparkVenv) applied@nadeem-Inspiron-5558:~$ start-all.sh
WARNING: Attempting to start all Apache Hadoop daemons as applied in 10 seconds.
WARNING: This is not a recommended production deployment configuration.
WARNING: Use CTRL-C to abort.
Starting namenodes on [nadeem-Inspiron-5558]
Starting datanodes
Starting secondary namenodes [nadeem-Inspiron-5558]
Starting resourcemanager
Starting nodemanagers
(sparkVenv) applied@nadeem-Inspiron-5558:~$ jps
21729 Jps
(sparkVenv) applied@nadeem-Inspiron-5558:~$
Here is a portion of the NameNode log:
************************************************************/
2019-05-06 15:36:43,116 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2019-05-06 15:36:43,252 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode []
2019-05-06 15:36:43,515 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2019-05-06 15:36:43,635 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s).
2019-05-06 15:36:43,636 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started
2019-05-06 15:36:43,671 INFO org.apache.hadoop.hdfs.server.namenode.NameNodeUtils: fs.defaultFS is file:///
2019-05-06 15:36:43,816 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
java.lang.IllegalArgumentException: Invalid URI for NameNode address (check fs.defaultFS): file:/// has no authority.
at org.apache.hadoop.hdfs.DFSUtilClient.getNNAddress(DFSUtilClient.java:697)
at org.apache.hadoop.hdfs.DFSUtilClient.getNNAddressCheckLogical(DFSUtilClient.java:726)
at org.apache.hadoop.hdfs.DFSUtilClient.getNNAddress(DFSUtilClient.java:688)
at org.apache.hadoop.hdfs.server.namenode.NameNode.getRpcServerAddress(NameNode.java:529)
at org.apache.hadoop.hdfs.server.namenode.NameNode.loginAsNameNodeUser(NameNode.java:660)
at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:680)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:937)
at org.apache.hadoop.hdfs.server.namenode.NameNode.(NameNode.java:910)
at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1643)
at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1710)
2019-05-06 15:36:43,819 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: java.lang.IllegalArgumentException: Invalid URI for NameNode address (check fs.defaultFS): file:/// has no authority.
2019-05-06 15:36:43,821 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at nadeem-Inspiron-5558/127.0.1.1
************************************************************/
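From the log, the NameNode exits because fs.defaultFS is still the default file:///, which I suspect means my core-site.xml was never configured for pseudo-distributed operation. I have not edited that file; based on the Hadoop single-node setup guide, I believe it would need something like the following (the hdfs://localhost:9000 URI is my assumption from the docs, not something I have verified on my machine):

```xml
<!-- $HADOOP_HOME/etc/hadoop/core-site.xml -->
<!-- Sets the default filesystem to HDFS so the NameNode has an
     authority (host:port) to bind to, instead of file:/// -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

Is this missing configuration the reason the daemons die immediately, or is standalone mode simply not supposed to run these daemons at all?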