I restarted the cluster after the server was heavily overloaded by another task, and now I get this:
2011-02-23 08:36:18,307 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: java.lang.NullPointerException
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.addChild(FSDirectory.java:1088)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.addChild(FSDirectory.java:1100)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.addNode(FSDirectory.java:1003)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.unprotectedAddFile(FSDirectory.java:206)
        at org.apache.hadoop.hdfs.server.namenode.FSEditLog.loadFSEdits(FSEditLog.java:637)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSEdits(FSImage.java:1034)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.loadFSImage(FSImage.java:845)
        at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:379)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.loadFSImage(FSDirectory.java:99)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.initialize(FSNamesystem.java:347)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.<init>(FSNamesystem.java:321)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:267)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:461)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1202)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1211)
Any ideas on how to fix this, or any howtos out there that I haven't been able to find?