is not recognized. There is a hack to start the namenode with the
command bin/hdfs namenode, but no idea how to stop it.
If it had been an issue with the config, the latter should not have
worked either.
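(For reference, a sketch of how such a namenode can be stopped. The paths below are assumptions for a 0.23-style layout; adjust HADOOP_PREFIX to your install.)

```shell
# If the namenode runs as a daemon, the stock daemon script can stop it:
$HADOOP_PREFIX/sbin/hadoop-daemon.sh stop namenode

# If it was started in the foreground with `bin/hdfs namenode`,
# interrupt it with Ctrl-C, or kill the JVM by matching its main class:
kill $(pgrep -f 'org.apache.hadoop.hdfs.server.namenode.NameNode')
```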
Thanks,
Nitin
2011/11/30 cat fa boost.subscrib...@gmail.com
From: cat fa
Sent: 2011-11-30 10:28
To: common-user
Subject: Re: Re: [help]how to stop HDFS
In fact, it's me who should say sorry. I used the word "install", which
I used the command:
$HADOOP_PREFIX_HOME/bin/hdfs start namenode --config $HADOOP_CONF_DIR
to start HDFS.
This command is in Hadoop document (here
http://hadoop.apache.org/common/docs/r0.23.0/hadoop-yarn/hadoop-yarn-site/ClusterSetup.html)
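(A sketch of the documented way to bring the namenode up in 0.23, assuming HADOOP_PREFIX points at the install root and HADOOP_CONF_DIR at a folder containing core-site.xml and hdfs-site.xml; names are assumptions, check them against your layout.)

```shell
# Point at the configuration directory first (assumed location):
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop

# Start the namenode as a daemon via the stock script:
$HADOOP_PREFIX/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR \
    --script hdfs start namenode
```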
However, I got errors like:
Exception in thread "main"
Should I use $HADOOP_CONF or $HADOOP_CONF_DIR? I'm using Hadoop 0.23.
Which class do you mean? Hadoop's or Java's?
2011/11/29 Prashant Sharma prashant.ii...@gmail.com
Try making $HADOOP_CONF point to the right classpath, including your
configuration folder.
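(A quick way to verify what actually lands on the classpath; the conf path below is an assumption for a 0.23 layout.)

```shell
# Export the configuration directory (assumed location):
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop

# Print the effective classpath; the conf dir should appear first,
# followed by the common/hdfs jars:
$HADOOP_PREFIX/bin/hadoop classpath
```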
On Tue, Nov 29, 2011 at 3:58 PM, cat
-010)82315908
* Email: hailong.yang1...@gmail.com
* Address: G413, New Main Building in Beihang University,
* No.37 XueYuan Road,HaiDian District,
* Beijing,P.R.China,100191
***
From: cat fa
Date: 2011-11-29 20:22
To: common
From the error, I can say that it is not able to find the jars
needed on the classpath, which is referenced by the scripts through
HADOOP_COMMON_HOME. I would also check the access permissions: which
user was it installed with, and which user is it running as?
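(A minimal sketch of that permission check; HADOOP_PREFIX and HADOOP_CONF_DIR are assumed to be set.)

```shell
# Who owns the install and the config directory:
ls -ld $HADOOP_PREFIX $HADOOP_CONF_DIR

# Who you are running the scripts as:
whoami

# If a namenode is already up, which user owns its process:
ps -o user= -p $(pgrep -f NameNode)
```

If the owner and the running user differ, the scripts may fail to read jars or write pid/log files.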
On Tue, Nov 29, 2011 at 10:48 PM, cat fa