I'm running a single-node cluster with Hadoop 1.0.1 on Ubuntu Linux 11.10. A simple script I was running crashed, probably because my computer went to sleep. I tried to reformat the HDFS file system using
bin/hadoop namenode -format
and got the following error:
ERROR namenode.NameNode: java.io.IOException: Cannot lock storage /app/hadoop/tmp/dfs/name. The directory is already locked.
    at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.lock(Storage.java:602)
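My guess is that a NameNode (or some leftover JVM from the crashed session) is still holding the lock on that directory, so my plan was to stop everything and confirm the lock is gone before formatting again. This is just a sketch of what I intended to try; the in_use.lock filename is my assumption based on the Storage error, not something I've confirmed:

bin/stop-all.sh                          # stop any daemons left over from before the crash
jps                                      # check that no NameNode/DataNode/JobTracker process is still running
ls /app/hadoop/tmp/dfs/name/in_use.lock  # I assume this is the lock file the error refers to

Does that sound like the right way to release the lock, or is there more to it?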
I then try to add the input files using the command:
bin/hadoop fs -copyFromLocal dataDirectory/*.txt inputDirectory
and get the error:
12/04/15 09:05:21 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /home/hduser/input/book1.txt could only be replicated to 0 nodes, instead of 1
12/04/15 09:05:21 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
12/04/15 09:05:21 WARN hdfs.DFSClient: Could not get block locations. Source file "/home/hduser/input/book1.txt" - Aborting...
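From what I've read, "could only be replicated to 0 nodes, instead of 1" usually means the NameNode can't see any live DataNode, so before copying again I was planning to check whether the DataNode actually started (I haven't run these yet):

jps                          # should list DataNode alongside NameNode, SecondaryNameNode, JobTracker, TaskTracker
bin/hadoop dfsadmin -report  # should show at least one datanode available to the namenode

Is that the right diagnosis, or could something else cause this?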
Afterwards the files show up in the input directory, but their sizes are all 0. Any ideas on how I can get the files added? Adding them worked fine before Hadoop crashed, so reinstalling Linux and Hadoop would presumably fix it, but that seems like overkill.
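If the HDFS data really is corrupt, I assume the lighter-weight alternative to a full reinstall would be to stop the daemons, wipe my hadoop.tmp.dir (/app/hadoop/tmp, per the error above), and reformat, roughly like this, but I'd like to confirm before deleting anything:

bin/stop-all.sh
rm -rf /app/hadoop/tmp/*     # throws away all HDFS data, so only as a last resort
bin/hadoop namenode -format
bin/start-all.sh

Thanks.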