I don't think you should be using HADOOP_HOME anymore; the 0.23
release scripts read HADOOP_PREFIX instead.

Try "unset HADOOP_HOME" and then "export HADOOP_PREFIX=/opt/hadoop"
and retry the NN command.
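
In full, something like this (a minimal sketch, assuming the
/opt/hadoop symlink from your steps below):

    # drop the deprecated variable and switch to the one 0.23 expects
    unset HADOOP_HOME
    export HADOOP_PREFIX=/opt/hadoop
    export PATH=${JAVA_HOME}/bin:${HADOOP_PREFIX}/bin:${PATH}
    ${HADOOP_PREFIX}/bin/hdfs namenode -format

If the class still can't be found, "${HADOOP_PREFIX}/bin/hadoop classpath"
prints the entries the scripts will search; the hadoop-hdfs jar
directories need to appear in that list.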

On Sun, Aug 11, 2013 at 8:50 AM, Jane Wayne <jane.wayne2...@gmail.com> wrote:
> Hi,
>
> I have downloaded and untarred Hadoop v0.23.9, and I am trying to set
> up a single-node instance to learn this version of Hadoop. I am also
> following, as best as I can, the instructions at
> http://hadoop.apache.org/docs/r0.23.9/hadoop-project-dist/hadoop-common/SingleCluster.html.
>
> When I attempt to run ${HADOOP_HOME}/bin/hdfs namenode -format, I get
> the following error:
>
> Error: Could not find or load main class
> org.apache.hadoop.hdfs.server.namenode.NameNode
>
> The instructions in the link above are incomplete. They jump right in
> and say, "assuming you have installed hadoop-common/hadoop-hdfs..."
> What does this assumption even mean? How do we install hadoop-common
> and hadoop-hdfs?
>
> Right now, I am running on CentOS 6.4 x64 (minimal install). My steps
> are the following:
>
> 0. Installed JDK 1.7 (Oracle)
> 1. tar xfz hadoop-0.23.9.tar.gz
> 2. mv hadoop-0.23.9 /opt
> 3. ln -s /opt/hadoop-0.23.9 /opt/hadoop
> 4. export HADOOP_HOME=/opt/hadoop
> 5. export JAVA_HOME=/opt/java
> 6. export PATH=${JAVA_HOME}/bin:${HADOOP_HOME}/bin:${PATH}
>
> Any help is appreciated.
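
P.S. If the prefix change alone doesn't help, it is also worth
confirming that the HDFS jar actually made it into the untarred tree
(just a sanity check, assuming the standard 0.23 tarball layout with
jars under share/hadoop/):

    # should list at least one hadoop-hdfs-0.23.9*.jar
    find /opt/hadoop/share/hadoop -name 'hadoop-hdfs-*.jar'

An empty result there would point at a bad download or untar rather
than an environment problem.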



-- 
Harsh J
