Re: [help]how to stop HDFS

2011-11-30 Thread Steve Loughran

On 30/11/11 04:29, Nitin Khandelwal wrote:

Thanks,
I missed the sbin directory; I was using the normal bin directory.
Nitin

On 30 November 2011 09:54, Harsh J ha...@cloudera.com wrote:


Like I wrote earlier, it's in the $HADOOP_HOME/sbin directory, not the
regular bin/ directory.

On Wed, Nov 30, 2011 at 9:52 AM, Nitin Khandelwal
nitin.khandel...@germinait.com  wrote:

I am using Hadoop 0.23.0.
There is no hadoop-daemon.sh in the bin directory.



I found the 0.23 scripts hard to set up and get working:

https://issues.apache.org/jira/browse/HADOOP-7838
https://issues.apache.org/jira/browse/MAPREDUCE-3430
https://issues.apache.org/jira/browse/MAPREDUCE-3432

I'd like to see what Bigtop will offer in this area, as their test 
process will involve installing onto system images and walking through 
the scripts. The basic Hadoop tars assume your system is well configured 
and that you know how to do this - and how to debug problems.
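
For reference, the sequence the posters below eventually converged on, as a
minimal sketch - it assumes a plain tar extract of 0.23.0 to /usr/bin/hadoop
(the path used later in this thread; all paths are illustrative):

  # export the variables in the shell that launches the daemons
  export HADOOP_CONF_DIR=/usr/bin/hadoop/conf    # use your actual config dir
  export HADOOP_COMMON_HOME=/usr/bin/hadoop
  export HADOOP_HDFS_HOME=/usr/bin/hadoop

  # in 0.23 the daemon control scripts live under sbin/, not bin/
  /usr/bin/hadoop/sbin/hadoop-daemon.sh start namenode
  /usr/bin/hadoop/sbin/hadoop-daemon.sh stop namenode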


Re: Re: [help]how to stop HDFS

2011-11-30 Thread cat fa
Thank you for your help.
I can use the /sbin/hadoop-daemon.sh {start|stop} {service} script to start a
namenode, but I can't start a resourcemanager.
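
(For the resourcemanager, the ClusterSetup page cited elsewhere in this thread
uses a separate YARN daemon script rather than hadoop-daemon.sh. A hedged
sketch, assuming your tarball ships a yarn-daemon.sh - its directory varies
between layouts, so locate it first:)

  # assumption: a yarn-daemon.sh exists somewhere under the extracted tree
  find /usr/bin/hadoop -name yarn-daemon.sh
  export YARN_CONF_DIR=$HADOOP_CONF_DIR    # YARN reads its own conf variable
  /usr/bin/hadoop/bin/yarn-daemon.sh start resourcemanager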

2011/11/30 Harsh J ha...@cloudera.com

 I simply use the /sbin/hadoop-daemon.sh {start|stop} {service} script
 to control daemons at my end.

 Does this not work for you? Or perhaps this thread is more about
 documenting that?

 --
 Harsh J



Re: [help]how to stop HDFS

2011-11-30 Thread cat fa
It seems the ClassNotFoundException is the most common problem here.
Try pointing HADOOP_COMMON_HOME to HADOOP_HOME/share/hadoop/common.

On my machine that is /usr/bin/hadoop/share/hadoop/common
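
(A minimal sketch of that suggestion; the HDFS analogue on the second line is
an assumption based on the same share/ layout, not something confirmed in this
thread:)

  export HADOOP_COMMON_HOME=/usr/bin/hadoop/share/hadoop/common
  export HADOOP_HDFS_HOME=/usr/bin/hadoop/share/hadoop/hdfs    # assumed analogue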

On 30 November 2011 at 18:50, hailong.yang1115 hailong.yang1...@gmail.com wrote:

 Actually I started to play with the latest release 0.23.0 on two nodes
 yesterday. It was easy to start HDFS, but it took me a while to configure
 YARN. I set HADOOP_COMMON_HOME to where I extracted the tarball and
 HADOOP_HDFS_HOME to the local dir I pointed HDFS to. After that I could
 bring up YARN and run the benchmark. But I am facing a problem: I cannot
 see the jobs in the UI. Also, when I started the historyserver, I got the
 following error.

 11/11/30 20:53:19 FATAL hs.JobHistoryServer: Error starting JobHistoryServer
 java.lang.RuntimeException: java.lang.ClassNotFoundException:
 org.apache.hadoop.fs.Hdfs
   at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1179)
   at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:142)
   at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:233)
   at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:315)
   at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:313)
   at java.security.AccessController.doPrivileged(Native Method)
   at javax.security.auth.Subject.doAs(Subject.java:396)
   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1152)
   at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:313)
   at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:426)
   at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:448)
   at org.apache.hadoop.mapreduce.v2.hs.JobHistory.init(JobHistory.java:183)
   at org.apache.hadoop.yarn.service.CompositeService.init(CompositeService.java:58)
   at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.init(JobHistoryServer.java:62)
   at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.main(JobHistoryServer.java:77)
 Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Hdfs
   at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
   at java.security.AccessController.doPrivileged(Native Method)
   at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
   at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
   at java.lang.Class.forName0(Native Method)
   at java.lang.Class.forName(Class.java:247)
   at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1125)
   at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1177)
   ... 14 more
 Any clue?

 Hailong
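
 (One way to check whether the missing class is visible at all - a sketch that
 assumes the bin/hadoop script and its classpath subcommand are present in
 this build:)

   # org.apache.hadoop.fs.Hdfs ships in the hdfs jars; they should appear here
   bin/hadoop classpath | tr ':' '\n' | grep -i hdfs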





Re: [help]how to stop HDFS

2011-11-29 Thread Prashant Sharma
Try making $HADOOP_CONF point to the right classpath, including your
configuration folder.


On Tue, Nov 29, 2011 at 3:58 PM, cat fa boost.subscrib...@gmail.com wrote:

 I used the command:

 $HADOOP_PREFIX_HOME/bin/hdfs start namenode --config $HADOOP_CONF_DIR

 to start HDFS.

 This command is in the Hadoop documentation (here:
 http://hadoop.apache.org/common/docs/r0.23.0/hadoop-yarn/hadoop-yarn-site/ClusterSetup.html
 )

 However, I got errors such as

 Exception in thread "main" java.lang.NoClassDefFoundError: start

 Could anyone tell me how to start and stop HDFS?

 By the way, how do I set Gmail so that it doesn't top-post my reply?



Re: [help]how to stop HDFS

2011-11-29 Thread cat fa
Use $HADOOP_CONF or $HADOOP_CONF_DIR? I'm using Hadoop 0.23.

Which class do you mean - a Hadoop class or a Java class?

2011/11/29 Prashant Sharma prashant.ii...@gmail.com

 Try making $HADOOP_CONF point to the right classpath, including your
 configuration folder.





Re: Re: [help]how to stop HDFS

2011-11-29 Thread hailong.yang1115
Try $HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config $HADOOP_CONF_DIR and
$HADOOP_PREFIX_HOME/bin/hdfs datanode stop --config $HADOOP_CONF_DIR. That
would stop the namenode and datanode separately.
HADOOP_CONF_DIR is the directory where you store your configuration files.
Hailong




***
* Hailong Yang, PhD Candidate
* Sino-German Joint Software Institute,
* School of Computer Science & Engineering, Beihang University
* Phone: (86-010)82315908
* Email: hailong.yang1...@gmail.com
* Address: G413, New Main Building in Beihang University,
*  No.37 XueYuan Road, HaiDian District,
*  Beijing, P.R. China, 100191
***

From: cat fa
Date: 2011-11-29 20:22
To: common-user
Subject: Re: [help]how to stop HDFS
Use $HADOOP_CONF or $HADOOP_CONF_DIR? I'm using Hadoop 0.23.

Which class do you mean - a Hadoop class or a Java class?

 


Re: Re: [help]how to stop HDFS

2011-11-29 Thread cat fa
It didn't work. It just gave me the usage information.

2011/11/29 hailong.yang1115 hailong.yang1...@gmail.com

 Try $HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config $HADOOP_CONF_DIR
 and $HADOOP_PREFIX_HOME/bin/hdfs datanode stop --config $HADOOP_CONF_DIR.
 That would stop the namenode and datanode separately.
 HADOOP_CONF_DIR is the directory where you store your configuration files.
 Hailong







Re: Re: [help]how to stop HDFS

2011-11-29 Thread Prashant Sharma
I mean, you have to export the variables:

export HADOOP_CONF_DIR=/path/to/your/configdirectory

Also export HADOOP_HDFS_HOME and HADOOP_COMMON_HOME before you run your
command. I suppose this should fix the problem.
-P
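
(A quick, generic way to confirm the variables are actually set in the shell
that launches the daemons - nothing Hadoop-specific about this check:)

  env | grep '^HADOOP_'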

On Tue, Nov 29, 2011 at 6:23 PM, cat fa boost.subscrib...@gmail.com wrote:

 It didn't work. It just gave me the usage information.



Re: Re: [help]how to stop HDFS

2011-11-29 Thread Prashant Sharma
I am sorry - I had no idea you had done an RPM install. My suggestion was
based on the assumption that you had done a tar-extract install, where all
three distributions have to be extracted and the variables exported.
Also, I have no experience with RPM-based installs, so no comment on what
went wrong in your case.

Basically, from the error I can say that it is not able to find the jars it
needs on the classpath, which the scripts reference through
HADOOP_COMMON_HOME. I would check the access permissions: which user was it
installed as, and which user is it running as?
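
(A concrete form of that check, with illustrative paths; the share/ layout is
an assumption carried over from elsewhere in the thread:)

  whoami                                       # the user the daemons run as
  ls -ld /usr/bin/hadoop                       # who owns the install
  ls -l /usr/bin/hadoop/share/hadoop/common    # are the jars readable?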

On Tue, Nov 29, 2011 at 10:48 PM, cat fa boost.subscrib...@gmail.com wrote:

 Thank you for your help, but I'm still a little confused.
 Suppose I installed Hadoop in /usr/bin/hadoop/. Should I
 point HADOOP_COMMON_HOME to /usr/bin/hadoop? Where should I
 point HADOOP_HDFS_HOME? Also to /usr/bin/hadoop/?



Re: Re: [help]how to stop HDFS

2011-11-29 Thread cat fa
In fact it's me who should say sorry. I used the word "install", which was
misleading.

In fact I downloaded a tar file and extracted it to /usr/bin/hadoop.

Could you please tell me where to point those variables?

2011/11/30, Prashant Sharma prashant.ii...@gmail.com:
 I am sorry - I had no idea you had done an RPM install. My suggestion was
 based on the assumption that you had done a tar-extract install, where all
 three distributions have to be extracted and the variables exported.
 Also, I have no experience with RPM-based installs, so no comment on what
 went wrong in your case.

 Basically, from the error I can say that it is not able to find the jars it
 needs on the classpath, which the scripts reference through
 HADOOP_COMMON_HOME. I would check the access permissions: which user was it
 installed as, and which user is it running as?



Re: Re: [help]how to stop HDFS

2011-11-29 Thread Nitin Khandelwal
Hi,

Even I am facing the same problem. There may be some issue with the script.
The doc says that to start the namenode you type:
bin/hdfs namenode start

But "start" is not recognized. There is a hack to start the namenode with
the command "bin/hdfs namenode &", but no idea how to stop it.
If it had been an issue with the config, the latter also should not have worked.

Thanks,
Nitin
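
(The hack spelled out; the trailing & is an inference, since the list archive
appears to strip ampersands. Stopping then means killing the process yourself:)

  bin/hdfs namenode &    # run the namenode as a background job
  kill $!                # stop it by killing the last background job's PID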


2011/11/30 cat fa boost.subscrib...@gmail.com

 In fact it's me who should say sorry. I used the word "install", which was
 misleading.

 In fact I downloaded a tar file and extracted it to /usr/bin/hadoop.

 Could you please tell me where to point those variables?



-- 


Nitin Khandelwal


Re: Re: [help]how to stop HDFS

2011-11-29 Thread Harsh J
I simply use the /sbin/hadoop-daemon.sh {start|stop} {service} script
to control daemons at my end.

Does this not work for you? Or perhaps this thread is more about
documenting that?
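
(Concretely, for the HDFS daemons discussed in this thread, and using the
sbin/ location that comes up elsewhere in the thread:)

  sbin/hadoop-daemon.sh start namenode
  sbin/hadoop-daemon.sh start datanode
  sbin/hadoop-daemon.sh stop datanode
  sbin/hadoop-daemon.sh stop namenode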

2011/11/30 Nitin Khandelwal nitin.khandel...@germinait.com:
 Hi,

 Even I am facing the same problem. There may be some issue with the script.
 The doc says that to start the namenode you type:
 bin/hdfs namenode start

 But "start" is not recognized. There is a hack to start the namenode with
 the command "bin/hdfs namenode &", but no idea how to stop it.
 If it had been an issue with the config, the latter also should not have worked.

 Thanks,
 Nitin


 --


 Nitin Khandelwal




-- 
Harsh J


Re: Re: [help]how to stop HDFS

2011-11-29 Thread Nitin Khandelwal
I am using Hadoop 0.23.0.
There is no hadoop-daemon.sh in the bin directory.

Thanks,
Nitin

On 30 November 2011 09:49, Harsh J ha...@cloudera.com wrote:

 I simply use the /sbin/hadoop-daemon.sh {start|stop} {service} script
 to control daemons at my end.

 Does this not work for you? Or perhaps this thread is more about
 documenting that?



 --
 Harsh J




-- 


Nitin Khandelwal


Re: Re: [help]how to stop HDFS

2011-11-29 Thread Harsh J
Like I wrote earlier, it's in the $HADOOP_HOME/sbin directory, not the
regular bin/ directory.

On Wed, Nov 30, 2011 at 9:52 AM, Nitin Khandelwal
nitin.khandel...@germinait.com wrote:
 I am using Hadoop 0.23.0.
 There is no hadoop-daemon.sh in the bin directory.

 Thanks,
 Nitin

 --


 Nitin Khandelwal




-- 
Harsh J


Re: Re: [help]how to stop HDFS

2011-11-29 Thread Nitin Khandelwal
Thanks,
I missed the sbin directory; I was using the normal bin directory.
Nitin

On 30 November 2011 09:54, Harsh J ha...@cloudera.com wrote:

 Like I wrote earlier, it's in the $HADOOP_HOME/sbin directory, not the
 regular bin/ directory.
