Normally this is due to the machine having been rebooted and /tmp being cleared out. You do not want to leave the Hadoop name node or data node storage in /tmp for this reason. Make sure you configure dfs.name.dir and dfs.data.dir to point to directories outside of /tmp, or any other location that may be cleared on boot.
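As a sketch, something like the following in conf/hdfs-site.xml would do it (the /var/lib/hadoop paths are only illustrative; any persistent, writable location works):

```xml
<!-- conf/hdfs-site.xml: keep HDFS metadata and block storage out of /tmp.
     The paths below are illustrative examples, not required locations. -->
<configuration>
  <property>
    <name>dfs.name.dir</name>
    <value>/var/lib/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/var/lib/hadoop/dfs/data</value>
  </property>
</configuration>
```

After changing these, create the directories, make sure they're writable by the user running Hadoop, and reformat the namenode (bin/hadoop namenode -format) since the old image in /tmp is gone.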
The quick setup guide is really just to help you start experimenting with Hadoop. For setting up a cluster for any real use, you'll want to follow the next guide, Cluster Setup:
http://hadoop.apache.org/common/docs/current/cluster_setup.html

On Wed, May 12, 2010 at 6:58 AM, Michael Robinson <hadoopmich...@gmail.com> wrote:
> Please help!!!
>
> I just downloaded and installed Hadoop-0.20.2 in Ubuntu following the
> instructions in
> http://hadoop.apache.org/common/docs/r0.20.2/quickstart.html.
>
> I did NOT get any errors during the installation, however when I try to
> run the example programs I get the following error in the namenode.log:
>
> INFO org.apache.hadoop.ipc.Server: Stopping server on 9000
>
> ERROR org.apache.hadoop.hdfs.server.namenode.NameNode:
> org.apache.hadoop.hdfs.server.common.InconsistentFSStateException:
> Directory /tmp/hadoop-root/dfs/name is in an inconsistent state: storage
> directory DOES NOT exist or is NOT accessible
>   at org.apache.hadoop.hdfs.server.namenode.FSImage.recoverTransitionRead(FSImage.java:290)
> ...
>
> Then it shuts down, leaving the examples in a hanging state.
>
> Where do I create the tmp directories, how many of them, and what are
> their names?
>
> Thank you,
>
> Michael Robinson

-- 
Eric Sammer
phone: +1-917-287-2675
twitter: esammer
data: www.cloudera.com