[help]how to stop HDFS

2011-11-30 Thread hailong.yang1115
Actually I started to play with the latest release 0.23.0 on two nodes 
yesterday. It was easy to start HDFS, but it took me a while to configure 
YARN. I set HADOOP_COMMON_HOME to the directory where I extracted the 
tarball and HADOOP_HDFS_HOME to the local directory I pointed HDFS to. 
After that I could bring up YARN and run the benchmark. But I am facing a 
problem: I cannot see the jobs in the UI, and when I started the 
historyserver, I got the following error.

11/11/30 20:53:19 FATAL hs.JobHistoryServer: Error starting JobHistoryServer
java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Hdfs
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1179)
    at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:142)
    at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:233)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:315)
    at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:313)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1152)
    at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:313)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:426)
    at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:448)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.init(JobHistory.java:183)
    at org.apache.hadoop.yarn.service.CompositeService.init(CompositeService.java:58)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.init(JobHistoryServer.java:62)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.main(JobHistoryServer.java:77)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Hdfs
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:247)
    at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1125)
    at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1177)
    ... 14 more
Any clue?

Hailong




***
* Hailong Yang, PhD. Candidate 
* Sino-German Joint Software Institute, 
* School of Computer Science and Engineering, Beihang University
* Phone: (86-010)82315908
* Email: hailong.yang1...@gmail.com
* Address: G413, New Main Building in Beihang University, 
*  No.37 XueYuan Road, HaiDian District, 
*  Beijing, P.R. China, 100191
***

From: cat fa
Date: 2011-11-30 10:28
To: common-user
Subject: Re: Re: [help]how to stop HDFS
In fact it's me who should say sorry. I used the word "install", which was misleading.

In fact I downloaded a tar file and extracted it to /usr/bin/hadoop

Could you please tell me where to point those variables?

2011/11/30, Prashant Sharma prashant.ii...@gmail.com:
 I am sorry, I had no idea you had done an rpm install; my suggestion was
 based on the assumption that you had done a tar-extract install, where all
 three distributions have to be extracted and the variables exported.
 Also, I have no experience with rpm-based installs, so no comment on
 what went wrong in your case.

 Basically, from the error I can say that it cannot find the needed jars
 on the classpath, which the scripts locate through HADOOP_COMMON_HOME.
 I would check the access permissions: which user was it installed as,
 and which user is it running as?
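 A quick way to sanity-check both points (jar visibility and permissions)
 from the shell; the /usr/bin/hadoop path below is just the hypothetical
 tarball location mentioned in this thread:

```shell
# Check that the jar providing org.apache.hadoop.fs.Hdfs exists under the
# common home and is readable by the user who runs the daemons.
# /usr/bin/hadoop is a placeholder; point it at your own tarball extract.
HADOOP_COMMON_HOME=${HADOOP_COMMON_HOME:-/usr/bin/hadoop}
if find "$HADOOP_COMMON_HOME" -name 'hadoop-hdfs-*.jar' 2>/dev/null | grep -q .; then
  echo "hdfs jar found under $HADOOP_COMMON_HOME"
else
  echo "hdfs jar missing under $HADOOP_COMMON_HOME (or not readable)"
fi
# Compare the install owner with the user you start the daemons as:
ls -ld "$HADOOP_COMMON_HOME" 2>/dev/null
id -un
```

 If the jar is missing or owned by a different user than the one running
 the daemons, that would explain the ClassNotFoundException above.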

 On Tue, Nov 29, 2011 at 10:48 PM, cat fa boost.subscrib...@gmail.comwrote:

 Thank you for your help, but I'm still a little confused.
 Suppose I installed hadoop in /usr/bin/hadoop/. Should I
 point HADOOP_COMMON_HOME to /usr/bin/hadoop/? Where should I
 point HADOOP_HDFS_HOME? Also to /usr/bin/hadoop/?

 2011/11/30 Prashant Sharma prashant.ii...@gmail.com

  I mean, you have to export the variables
 
  export HADOOP_CONF_DIR=/path/to/your/configdirectory.
 
  Also export HADOOP_HDFS_HOME and HADOOP_COMMON_HOME before you run your
  command. I suppose this should fix the problem.
  -P
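
  Spelled out, the exports described above look like this; the
  /usr/bin/hadoop prefix is an assumption taken from earlier in the
  thread, so substitute your own extract path:

```shell
# All three variables point into the extracted tarball in the simplest
# single-directory layout; adjust if you split common/hdfs across dirs.
export HADOOP_COMMON_HOME=/usr/bin/hadoop
export HADOOP_HDFS_HOME=/usr/bin/hadoop
export HADOOP_CONF_DIR=/usr/bin/hadoop/conf
echo "conf dir: $HADOOP_CONF_DIR"
```

  Put these in your shell profile (or a small env script you source) so
  every daemon and client command sees the same values.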
 
  On Tue, Nov 29, 2011 at 6:23 PM, cat fa boost.subscrib...@gmail.com
  wrote:
 
   It didn't work; it only gave me the usage information.
  
   2011/11/29 hailong.yang1115 hailong.yang1...@gmail.com
  
Try $HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config
  $HADOOP_CONF_DIR
and 

Re: Re: [help]how to stop HDFS

2011-11-29 Thread hailong.yang1115
Try $HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config $HADOOP_CONF_DIR and 
$HADOOP_PREFIX_HOME/bin/hdfs datanode stop --config $HADOOP_CONF_DIR. It would 
stop namenode and datanode separately.
HADOOP_CONF_DIR is the directory where you store your configuration files.
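
As a sketch, the two stop invocations above can be wrapped with a guard so
they only run when the hdfs launcher is actually present; the paths are
hypothetical, so adjust HADOOP_PREFIX_HOME to your own install:

```shell
# Assumed tarball prefix and conf dir; override via the environment.
export HADOOP_PREFIX_HOME=${HADOOP_PREFIX_HOME:-/usr/bin/hadoop}
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-$HADOOP_PREFIX_HOME/conf}

# The two commands suggested above, one per daemon:
stop_namenode="$HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config $HADOOP_CONF_DIR"
stop_datanode="$HADOOP_PREFIX_HOME/bin/hdfs datanode stop --config $HADOOP_CONF_DIR"

if [ -x "$HADOOP_PREFIX_HOME/bin/hdfs" ]; then
  $stop_namenode
  $stop_datanode
else
  echo "hdfs launcher not found at $HADOOP_PREFIX_HOME/bin/hdfs" >&2
fi
```

Run it once per node that hosts a namenode or datanode.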
Hailong





From: cat fa
Date: 2011-11-29 20:22
To: common-user
Subject: Re: [help]how to stop HDFS
Use $HADOOP_CONF or $HADOOP_CONF_DIR? I'm using Hadoop 0.23.

You mean which classpath? Hadoop's or Java's?

2011/11/29 Prashant Sharma prashant.ii...@gmail.com

 Try making $HADOOP_CONF point to the right classpath, including your
 configuration folder.


 On Tue, Nov 29, 2011 at 3:58 PM, cat fa boost.subscrib...@gmail.com
 wrote:

  I used the command :
 
  $HADOOP_PREFIX_HOME/bin/hdfs start namenode --config $HADOOP_CONF_DIR
 
  to start HDFS.
 
  This command is in the Hadoop documentation (here:
  http://hadoop.apache.org/common/docs/r0.23.0/hadoop-yarn/hadoop-yarn-site/ClusterSetup.html
  )
 
  However, I got errors as
 
  Exception in thread "main" java.lang.NoClassDefFoundError: start
 
  Could anyone tell me how to start and stop HDFS?
 
  By the way, how do I set Gmail so that it doesn't top-post my reply?