Re: Re: [help]how to stop HDFS

2011-11-29 Thread hailong . yang1115
Try $HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config $HADOOP_CONF_DIR and
$HADOOP_PREFIX_HOME/bin/hdfs datanode stop --config $HADOOP_CONF_DIR. That
should stop the namenode and datanode separately.
The HADOOP_CONF_DIR is the directory where you store your configuration files.
Hailong




***
* Hailong Yang, PhD. Candidate 
* Sino-German Joint Software Institute, 
* School of Computer Science&Engineering, Beihang University
* Phone: (86-010)82315908
* Email: hailong.yang1...@gmail.com
* Address: G413, New Main Building in Beihang University, 
*  No.37 XueYuan Road,HaiDian District, 
*  Beijing,P.R.China,100191
***

From: cat fa
Date: 2011-11-29 20:22
To: common-user
Subject: Re: [help]how to stop HDFS
use $HADOOP_CONF or $HADOOP_CONF_DIR ? I'm using hadoop 0.23.

you mean which class? the class of hadoop or of java?

2011/11/29 Prashant Sharma 

> Try making $HADOOP_CONF point to the right classpath, including your
> configuration folder.
>
>
> On Tue, Nov 29, 2011 at 3:58 PM, cat fa 
> wrote:
>
> > I used the command :
> >
> > $HADOOP_PREFIX_HOME/bin/hdfs start namenode --config $HADOOP_CONF_DIR
> >
> > to start HDFS.
> >
> > This command is in the Hadoop documentation (here:
> > http://hadoop.apache.org/common/docs/r0.23.0/hadoop-yarn/hadoop-yarn-site/ClusterSetup.html )
> >
> > However, I got errors as
> >
> > Exception in thread "main" java.lang.NoClassDefFoundError:start
> >
> > Could anyone tell me how to start and stop HDFS?
> >
> > By the way, how to set Gmail so that it doesn't top post my reply?
> >
>

Re: Re: [help]how to stop HDFS

2011-11-29 Thread cat fa
It didn't work. It just gave me the usage information.



Re: Re: [help]how to stop HDFS

2011-11-29 Thread Prashant Sharma
I mean, you have to export the variables:

export HADOOP_CONF_DIR=/path/to/your/configdirectory

Also export HADOOP_HDFS_HOME and HADOOP_COMMON_HOME before you run your
command. I suppose this should fix the problem.
-P
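Putting the advice above together, a minimal sketch might look like the following. The paths are examples only (they assume a tar-extract install under /usr/local) and must be adjusted to wherever your distribution actually lives:

```shell
# Hypothetical paths -- adjust to your own extract locations.
export HADOOP_COMMON_HOME=/usr/local/hadoop-common
export HADOOP_HDFS_HOME=/usr/local/hadoop-hdfs
export HADOOP_CONF_DIR=/usr/local/hadoop-common/conf

# With these exported, the hdfs script can locate its jars, e.g.:
#   $HADOOP_HDFS_HOME/bin/hdfs namenode
echo "$HADOOP_CONF_DIR"
```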



Re: Re: [help]how to stop HDFS

2011-11-29 Thread cat fa
Thank you for your help, but I'm still a little confused.
Suppose I installed Hadoop in /usr/bin/hadoop/. Should I
point HADOOP_COMMON_HOME to /usr/bin/hadoop? And where should
HADOOP_HDFS_HOME point? Also to /usr/bin/hadoop/?



Re: Re: [help]how to stop HDFS

2011-11-29 Thread Prashant Sharma
I am sorry, I had no idea you had done an rpm install; my suggestion was
based on the assumption that you had done a tar-extract install, where all
three distributions have to be extracted and the variables exported.
I also have no experience with rpm-based installs, so no comment on
what went wrong in your case.

Basically, from the error I can say that it is unable to find the jars it
needs on the classpath, which the scripts locate through
HADOOP_COMMON_HOME. I would also check the access permissions: which
user was it installed as, and which user is it running as?
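A quick way to check the ownership question raised above, sketched with the /usr/bin/hadoop path mentioned earlier in this thread (HADOOP_DIR is a placeholder, not a variable Hadoop itself uses):

```shell
# Compare the owner of the install tree with the user running the daemons.
HADOOP_DIR=${HADOOP_DIR:-/usr/bin/hadoop}
ls -ld "$HADOOP_DIR" 2>/dev/null || echo "no such directory: $HADOOP_DIR"
id -un   # the user the daemons would run as
```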



Re: Re: [help]how to stop HDFS

2011-11-29 Thread cat fa
In fact it's me who should say sorry. I used the word "install", which was misleading.

I actually downloaded a tar file and extracted it to /usr/bin/hadoop

Could you please tell me where to point those variables?



Re: Re: [help]how to stop HDFS

2011-11-29 Thread Nitin Khandelwal
Hi,

Even I am facing the same problem; there may be an issue with the script.
The doc says that to start the namenode you type:
bin/hdfs namenode start

But "start" is not recognized. There is a hack to start the namenode with
the command "bin/hdfs namenode &", but no idea how to stop it.
If it had been a config issue, the latter should not have worked either.
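One way to make the "bin/hdfs namenode &" hack stoppable -- a workaround sketch, not something from the Hadoop docs -- is to record the background pid yourself and kill it later:

```shell
# Workaround sketch: start the namenode in the background, remember its
# pid, and stop it later with kill. namenode.log/namenode.pid are
# arbitrary filenames chosen here for illustration.
bin/hdfs namenode > namenode.log 2>&1 &
echo $! > namenode.pid

# ...later, to stop it:
kill "$(cat namenode.pid)" && rm namenode.pid
```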

Thanks,
Nitin





-- 


Nitin Khandelwal


Re: Re: [help]how to stop HDFS

2011-11-29 Thread Harsh J
I simply use the /sbin/hadoop-daemon.sh {start|stop} {service} script
to control daemons at my end.

Does this not work for you? Or perhaps this thread is more about
documenting that?
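For concreteness, the pattern described above would look like the following, assuming the 0.23 tarball layout where the daemon script lives under sbin/ (not bin/) and $HADOOP_PREFIX points at the extract root:

```shell
# Start the HDFS daemons via the daemon-control script.
$HADOOP_PREFIX/sbin/hadoop-daemon.sh start namenode
$HADOOP_PREFIX/sbin/hadoop-daemon.sh start datanode

# ...and stop them the same way:
$HADOOP_PREFIX/sbin/hadoop-daemon.sh stop namenode
$HADOOP_PREFIX/sbin/hadoop-daemon.sh stop datanode
```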



-- 
Harsh J


Re: Re: [help]how to stop HDFS

2011-11-29 Thread Nitin Khandelwal
I am using Hadoop 0.23.0.
There is no hadoop-daemon.sh in the bin directory.

Thanks,
Nitin


Re: Re: [help]how to stop HDFS

2011-11-29 Thread Harsh J
Like I wrote earlier, it's in the $HADOOP_HOME/sbin directory, not the
regular bin/ directory.

>> >> >> > > >
>> >> >> > > > > Try making $HADOOP_CONF point to right classpath including
>> your
>> >> >> > > > > configuration folder.
>> >> >> > > > >
>> >> >> > > > >
>> >> >> > > > > On Tue, Nov 29, 2011 at 3:58 PM, cat fa <
>> >> >> boost.subscrib...@gmail.com
>> >> >> > >
>> >> >> > > > > wrote:
>> >> >> > > > >
>> >> >> > > > > > I used the command :
>> >> >> > > > > >
>> >> >> > > > > > $HADOOP_PREFIX_HOME/bin/hdfs start namenode --config
>> >> >> > $HADOOP_CONF_DIR
>> >> >> > > > > >
>> >> >> > > > > > to start HDFS.
>> >> >> > > > > >
>> >> >> > > > > > This command is in Hadoop document (here
>> >> >> > > > > > <
>> >> >> > > > > >
>> >> >> > 

Re: Re: [help]how to stop HDFS

2011-11-29 Thread Nitin Khandelwal
Thanks,
I missed the "sbin" directory; I was using the regular bin/ directory.
Nitin

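For reference, the sbin-based daemon control under 0.23 can be sketched as follows. The install prefix, conf/ directory, and exported variables below are assumptions based on a tarball extract, not taken from the thread; adjust them to your own layout.

```shell
# Assumed layout: 0.23 tarball extracted to /usr/lib/hadoop (hypothetical path).
export HADOOP_PREFIX=/usr/lib/hadoop
export HADOOP_CONF_DIR=$HADOOP_PREFIX/conf
export HADOOP_COMMON_HOME=$HADOOP_PREFIX
export HADOOP_HDFS_HOME=$HADOOP_PREFIX

# The daemon-control script lives under sbin/, not bin/, in 0.23.
$HADOOP_PREFIX/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR start namenode
$HADOOP_PREFIX/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR start datanode

# Stop the daemons the same way:
$HADOOP_PREFIX/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR stop datanode
$HADOOP_PREFIX/sbin/hadoop-daemon.sh --config $HADOOP_CONF_DIR stop namenode
```

If a daemon fails to come up, the per-daemon log files under the install's logs/ directory are usually the first place to look.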
On 30 November 2011 09:54, Harsh J  wrote:

> Like I wrote earlier, it's in the $HADOOP_HOME/sbin directory, not the
> regular bin/ directory.
>
> On Wed, Nov 30, 2011 at 9:52 AM, Nitin Khandelwal
>  wrote:
> > I am using Hadoop 0.23.0
> > There is no hadoop-daemon.sh in the bin directory.
> >
> > Thanks,
> > Nitin
> >
> > On 30 November 2011 09:49, Harsh J  wrote:
> >
> >> I simply use the /sbin/hadoop-daemon.sh {start|stop} {service} script
> >> to control daemons at my end.
> >>
> >> Does this not work for you? Or perhaps this thread is more about
> >> documenting that?
> >>
> >> 2011/11/30 Nitin Khandelwal :
> >> > Hi,
> >> >
> >> > Even I am facing the same problem. There may be some issue with the
> >> > script. The doc says that to start the namenode you type:
> >> > bin/hdfs namenode start
> >> >
> >> > But "start" is not recognized. There is a hack to start the namenode
> >> > with the command "bin/hdfs namenode &", but no idea how to stop it.
> >> > If it had been an issue with config, the latter also should not have
> >> > worked.
> >> >
> >> > Thanks,
> >> > Nitin
> >> >
> >> >
> >> > 2011/11/30 cat fa 
> >> >
> >> >> In fact, I'm the one who should apologize. I used the word "install",
> >> >> which was misleading.
> >> >>
> >> >> In fact I downloaded a tar file and extracted it to /usr/bin/hadoop
> >> >>
> >> >> Could you please tell me where to point those variables?
> >> >>
> >> >> 2011/11/30, Prashant Sharma :
> >> >> > I am sorry, I had no idea you had done an RPM install; my
> >> >> > suggestion was based on the assumption that you had done a
> >> >> > tar-extract install, where all three distributions have to be
> >> >> > extracted and the variables exported. Also, I have no experience
> >> >> > with RPM-based installs, so no comments about what went wrong in
> >> >> > your case.
> >> >> >
> >> >> > Basically, from the error I can say that it is not able to find
> >> >> > the jars needed on the classpath, which the scripts locate through
> >> >> > HADOOP_COMMON_HOME. I would say check the access permissions, as
> >> >> > in: which user was it installed with, and which user is it running
> >> >> > as?
> >> >> >
> >> >> > On Tue, Nov 29, 2011 at 10:48 PM, cat fa <
> boost.subscrib...@gmail.com
> >> >> >wrote:
> >> >> >
> >> >> >> Thank you for your help, but I'm still a little confused.
> >> >> >> Suppose I installed hadoop in /usr/bin/hadoop/. Should I
> >> >> >> point HADOOP_COMMON_HOME to /usr/bin/hadoop ? Where should I
> >> >> >> point HADOOP_HDFS_HOME? Also to /usr/bin/hadoop/ ?
> >> >> >>
> >> >> >> 2011/11/30 Prashant Sharma 
> >> >> >>
> >> >> >> > I mean, you have to export the variables
> >> >> >> >
> >> >> >> > export HADOOP_CONF_DIR=/path/to/your/configdirectory.
> >> >> >> >
> >> >> >> > also export HADOOP_HDFS_HOME ,HADOOP_COMMON_HOME. before your
> run
> >> your
> >> >> >> > command. I suppose this should fix the problem.
> >> >> >> > -P
> >> >> >> >
> >> >> >> > On Tue, Nov 29, 2011 at 6:23 PM, cat fa <
> >> boost.subscrib...@gmail.com>
> >> >> >> > wrote:
> >> >> >> >
> >> >> >> > > it didn't work. It gave me the Usage information.
> >> >> >> > >
> >> >> >> > > 2011/11/29 hailong.yang1115 
> >> >> >> > >
> >> >> >> > > > Try $HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config
> >> >> >> > $HADOOP_CONF_DIR
> >> >> >> > > > and $HADOOP_PREFIX_HOME/bin/hdfs datanode stop --config
> >> >> >> > $HADOOP_CONF_DIR.
> >> >> >> > > > It would stop namenode and datanode separately.
> >> >> >> > > > The HADOOP_CONF_DIR is the directory where you store your
> >> >> >> configuration
> >> >> >> > > > files.
> >> >> >> > > > Hailong
> >> >> >> > > >
> >> >> >> > > >
> >> >> >> > > >
> >> >> >> > > >
> >> >> >> > > > ***
> >> >> >> > > > * Hailong Yang, PhD. Candidate
> >> >> >> > > > * Sino-German Joint Software Institute,
> >> >> >> > > > * School of Computer Science&Engineering, Beihang University
> >> >> >> > > > * Phone: (86-010)82315908
> >> >> >> > > > * Email: hailong.yang1...@gmail.com
> >> >> >> > > > * Address: G413, New Main Building in Beihang University,
> >> >> >> > > > *  No.37 XueYuan Road,HaiDian District,
> >> >> >> > > > *  Beijing,P.R.China,100191
> >> >> >> > > > ***
> >> >> >> > > >
> >> >> >> > > > From: cat fa
> >> >> >> > > > Date: 2011-11-29 20:22
> >> >> >> > > > To: common-user
> >> >> >> > > > Subject: Re: [help]how to stop HDFS
> >> >> >> > > > use $HADOOP_CONF or $HADOOP_CONF_DIR ? I'm using hadoop
> 0.23.
> >> >> >> > > >
> >> >> >> > > > you mean which class? the class of hadoop or of java?
> >> >> >> > > >
> >> >> >> > > > 2011/11/29 Prashant Sharma 
> >> >> >> > > >
> >> >> >> > > > > Try making $HADOOP_CONF point to right classpath including
> >> your
> >> >> >> > > > > configuration folder.
> >> >> >> > > > >
> >> >> >> > > > >
> >> >> >> > > > > On Tue, Nov 29, 2011 at 3:58 PM, cat fa 

Re: Re: [help]how to stop HDFS

2011-11-30 Thread cat fa
Thank you for your help.
I can use the /sbin/hadoop-daemon.sh {start|stop} {service} script to start a
namenode, but I can't start a resourcemanager.
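The resourcemanager is a YARN daemon rather than an HDFS one, so hadoop-daemon.sh does not manage it; yarn-daemon.sh does. A sketch of the 0.23-style commands follows; the $YARN_HOME prefix and conf/ path are assumptions based on a tarball layout, not details from the thread.

```shell
# The resourcemanager belongs to YARN, so it is controlled by
# yarn-daemon.sh, not hadoop-daemon.sh. All paths here are assumed
# (tarball extracted to /usr/lib/hadoop); adjust to your install.
export YARN_HOME=/usr/lib/hadoop
export HADOOP_CONF_DIR=$YARN_HOME/conf

$YARN_HOME/bin/yarn-daemon.sh --config $HADOOP_CONF_DIR start resourcemanager
$YARN_HOME/bin/yarn-daemon.sh --config $HADOOP_CONF_DIR start nodemanager

# Stop in reverse order:
$YARN_HOME/bin/yarn-daemon.sh --config $HADOOP_CONF_DIR stop nodemanager
$YARN_HOME/bin/yarn-daemon.sh --config $HADOOP_CONF_DIR stop resourcemanager
```

If the script is not at bin/yarn-daemon.sh in your build, check sbin/ as well; the location moved between releases.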

2011/11/30 Harsh J 

> I simply use the /sbin/hadoop-daemon.sh {start|stop} {service} script
> to control daemons at my end.
>
> Does this not work for you? Or perhaps this thread is more about
> documenting that?
>
> 2011/11/30 Nitin Khandelwal :
> > Hi,
> >
> > Even I am facing the same problem. There may be some issue with the
> > script. The doc says that to start the namenode you type:
> > bin/hdfs namenode start
> >
> > But "start" is not recognized. There is a hack to start the namenode
> > with the command "bin/hdfs namenode &", but no idea how to stop it.
> > If it had been an issue with config, the latter also should not have
> > worked.
> >
> > Thanks,
> > Nitin
> >
> >
> > 2011/11/30 cat fa 
> >
> >> In fact, I'm the one who should apologize. I used the word "install",
> >> which was misleading.
> >>
> >> In fact I downloaded a tar file and extracted it to /usr/bin/hadoop
> >>
> >> Could you please tell me where to point those variables?
> >>
> >> 2011/11/30, Prashant Sharma :
> >> > I am sorry, I had no idea you had done an RPM install; my suggestion
> >> > was based on the assumption that you had done a tar-extract install,
> >> > where all three distributions have to be extracted and the variables
> >> > exported. Also, I have no experience with RPM-based installs, so no
> >> > comments about what went wrong in your case.
> >> >
> >> > Basically, from the error I can say that it is not able to find the
> >> > jars needed on the classpath, which the scripts locate through
> >> > HADOOP_COMMON_HOME. I would say check the access permissions, as in:
> >> > which user was it installed with, and which user is it running as?
> >> >
> >> > On Tue, Nov 29, 2011 at 10:48 PM, cat fa  >> >wrote:
> >> >
> >> >> Thank you for your help, but I'm still a little confused.
> >> >> Suppose I installed hadoop in /usr/bin/hadoop/. Should I
> >> >> point HADOOP_COMMON_HOME to /usr/bin/hadoop ? Where should I
> >> >> point HADOOP_HDFS_HOME? Also to /usr/bin/hadoop/ ?
> >> >>
> >> >> 2011/11/30 Prashant Sharma 
> >> >>
> >> >> > I mean, you have to export the variables
> >> >> >
> >> >> > export HADOOP_CONF_DIR=/path/to/your/configdirectory.
> >> >> >
> >> >> > also export HADOOP_HDFS_HOME ,HADOOP_COMMON_HOME. before your run
> your
> >> >> > command. I suppose this should fix the problem.
> >> >> > -P
> >> >> >
> >> >> > On Tue, Nov 29, 2011 at 6:23 PM, cat fa <
> boost.subscrib...@gmail.com>
> >> >> > wrote:
> >> >> >
> >> >> > > it didn't work. It gave me the Usage information.
> >> >> > >
> >> >> > > 2011/11/29 hailong.yang1115 
> >> >> > >
> >> >> > > > Try $HADOOP_PREFIX_HOME/bin/hdfs namenode stop --config
> >> >> > $HADOOP_CONF_DIR
> >> >> > > > and $HADOOP_PREFIX_HOME/bin/hdfs datanode stop --config
> >> >> > $HADOOP_CONF_DIR.
> >> >> > > > It would stop namenode and datanode separately.
> >> >> > > > The HADOOP_CONF_DIR is the directory where you store your
> >> >> configuration
> >> >> > > > files.
> >> >> > > > Hailong
> >> >> > > >
> >> >> > > >
> >> >> > > >
> >> >> > > >
> >> >> > > > ***
> >> >> > > > * Hailong Yang, PhD. Candidate
> >> >> > > > * Sino-German Joint Software Institute,
> >> >> > > > * School of Computer Science&Engineering, Beihang University
> >> >> > > > * Phone: (86-010)82315908
> >> >> > > > * Email: hailong.yang1...@gmail.com
> >> >> > > > * Address: G413, New Main Building in Beihang University,
> >> >> > > > *  No.37 XueYuan Road,HaiDian District,
> >> >> > > > *  Beijing,P.R.China,100191
> >> >> > > > ***
> >> >> > > >
> >> >> > > > From: cat fa
> >> >> > > > Date: 2011-11-29 20:22
> >> >> > > > To: common-user
> >> >> > > > Subject: Re: [help]how to stop HDFS
> >> >> > > > use $HADOOP_CONF or $HADOOP_CONF_DIR ? I'm using hadoop 0.23.
> >> >> > > >
> >> >> > > > you mean which class? the class of hadoop or of java?
> >> >> > > >
> >> >> > > > 2011/11/29 Prashant Sharma 
> >> >> > > >
> >> >> > > > > Try making $HADOOP_CONF point to right classpath including
> your
> >> >> > > > > configuration folder.
> >> >> > > > >
> >> >> > > > >
> >> >> > > > > On Tue, Nov 29, 2011 at 3:58 PM, cat fa <
> >> >> boost.subscrib...@gmail.com
> >> >> > >
> >> >> > > > > wrote:
> >> >> > > > >
> >> >> > > > > > I used the command :
> >> >> > > > > >
> >> >> > > > > > $HADOOP_PREFIX_HOME/bin/hdfs start namenode --config
> >> >> > $HADOOP_CONF_DIR
> >> >> > > > > >
> >> >> > > > > > to start HDFS.
> >> >> > > > > >
> >> >> > > > > > This command is in Hadoop document (here
> >> >> > > > > > <
> >> >> > > > > >
> >> >> > > > >
> >> >> > > >
> >> >> > >
> >> >> >
> >> >>
> >>
> http://hadoop.apache.org/common/docs/r0.23.0/hadoop-yarn/hadoop-yarn-site/ClusterSetup.html
> >> >> > > > > > >)
> >> >> > > > > >
> >> >> > > > > > However, I got an error:
> >> >> > > > > >
> >> >> > >