Re: Why $HADOOP_PREFIX ?

2012-02-01 Thread Prashant Sharma
I think you have misunderstood something. As far as I know, these
variables are set automatically when you run a script. The name
HADOOP_PREFIX is obscure for some strange reason. ;)

The warning "$HADOOP_HOME is deprecated" is always there, whether the
variable is set or not. Why?
Because hadoop-config.sh is sourced by all the scripts, and it sets
HADOOP_HOME from HADOOP_PREFIX. I think this can be reported as a bug.
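
For reference, the relevant logic in hadoop-config.sh looks roughly like
this (paraphrased from memory of the 0.20.205 scripts rather than copied
verbatim, so treat it as a sketch):

    # Sketch of hadoop-config.sh (paraphrased, not the exact source).
    # Warn if HADOOP_HOME arrived already set in the environment:
    if [ "$HADOOP_HOME_WARN_SUPPRESS" = "" ] && [ "$HADOOP_HOME" != "" ]; then
      echo "Warning: \$HADOOP_HOME is deprecated." 1>&2
    fi
    # ...and then re-export it anyway, for backwards compatibility:
    export HADOOP_HOME=${HADOOP_PREFIX}

Since the script exports HADOOP_HOME itself, any script that sources
hadoop-config.sh and then invokes another hadoop script hands the child an
already-set HADOOP_HOME, which is why the warning shows up even when the
user never set the variable.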

-P


On Wed, Feb 1, 2012 at 5:46 PM, praveenesh kumar wrote:

> Does anyone have an idea why $HADOOP_PREFIX was introduced instead of
> $HADOOP_HOME in hadoop 0.20.205?
>
> I believe $HADOOP_HOME was not giving any trouble, or is there a reason/new
> feature that requires $HADOOP_PREFIX to be added?
>
> It's kind of funny, but I am used to $HADOOP_HOME. Just curious to know the
> reason for this change.
> Also, there are some old packages (I am not referring to apache/cloudera/or
> any hadoop distribution) that depend on hadoop and still use
> $HADOOP_HOME inside. So it's kind of weird that when you use those packages,
> you still get warning messages even though the warning is suppressed on the
> Hadoop side.
>
>
> Thanks,
> Praveenesh
>


Re: Why $HADOOP_PREFIX ?

2012-02-01 Thread praveenesh kumar
Interesting and strange.
But is there any reason for setting $HADOOP_HOME to $HADOOP_PREFIX in
hadoop-config.sh and then checking in bin/hadoop whether $HADOOP_HOME is
non-empty?

I mean, if I comment out the export HADOOP_HOME=${HADOOP_PREFIX} in
hadoop-config.sh, does it make any difference?
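
For what it's worth, one way to probe this without editing the shipped
scripts is to toggle the warning suppression and compare runs; the
HADOOP_HOME_WARN_SUPPRESS variable is the knob the 0.20.205 scripts honor
for this, if I remember right (the fs -ls command is just an arbitrary
example):

    # Run the same command with and without suppression and diff the output:
    HADOOP_HOME_WARN_SUPPRESS=1 bin/hadoop fs -ls /   # warning silenced
    bin/hadoop fs -ls /                               # warning printed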

Thanks,
Praveenesh


Re: Why $HADOOP_PREFIX ?

2012-02-01 Thread Robert Evans
I think it comes down to a long history of splitting and then remerging the
hadoop project.  I could be wrong about a lot of this, so take it with a grain
of salt.  Hadoop originally was, and on 1.0 still is, a single project: HDFS,
mapreduce, and common are all compiled together into a single jar,
hadoop-core.  In that respect HADOOP_HOME made a lot of sense, because it was
a single thing, with some dependencies that needed to be found by some shell
scripts.

Fast forward: the projects were split, HADOOP_HOME was deprecated, and
HADOOP_COMMON_HOME, HADOOP_MAPRED_HOME, and HADOOP_HDFS_HOME were born.  But
if we install them all into a single tree, it is a pain to configure all of
these to point to the same place, and HADOOP_HOME is deprecated, so
HADOOP_PREFIX was born.  NOTE: as was stated before, all of these are
supposed to be hidden from the end user and are intended more for packaging
and deploying hadoop.  Also, the process is not done and it is likely to
change further.
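
To make the single-tree case concrete, here is a sketch of what such a
setup can boil down to; the defaulting pattern and the /usr/local/hadoop
path are illustrative assumptions, not the exact script logic:

    # Illustrative: one prefix, with the per-project homes defaulting to it.
    export HADOOP_PREFIX=/usr/local/hadoop              # assumed install path
    export HADOOP_COMMON_HOME=${HADOOP_COMMON_HOME:-$HADOOP_PREFIX}
    export HADOOP_HDFS_HOME=${HADOOP_HDFS_HOME:-$HADOOP_PREFIX}
    export HADOOP_MAPRED_HOME=${HADOOP_MAPRED_HOME:-$HADOOP_PREFIX}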

--Bobby Evans


Re: Why $HADOOP_PREFIX ?

2012-02-01 Thread Harsh J
Personal opinion here: for branch-1, I do think the earlier tarball
structure was better. I do not see why it had to change, for this
version at least. It was possibly changed during all the work of adding
packaging-related scripts for rpm/deb into Hadoop itself, but the
tarball right now is not as usable as it was before, and the older format
would've still worked today.

-- 
Harsh J
Customer Ops. Engineer
Cloudera | http://tiny.cloudera.com/about


Re: Why $HADOOP_PREFIX ?

2012-02-01 Thread Prashant Sharma
@Harsh, I sometimes get similar thoughts :P. But I wonder if something
can be done about it.

@Bobby, thanks for elaborating on the strange reason. :)

@Praveenesh, yes, you can do away with sourcing hadoop-config.sh and set
all the necessary variables by hand, along the lines of the sketch below.
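
A minimal sketch of what that could look like; the paths and the exact
variable list here are assumptions (check your own install, since different
scripts read different subsets of these):

    # Hypothetical manual setup instead of sourcing hadoop-config.sh:
    export JAVA_HOME=/usr/lib/jvm/java-6-sun          # assumed JVM path
    export HADOOP_PREFIX=/usr/local/hadoop            # assumed install path
    export HADOOP_CONF_DIR=$HADOOP_PREFIX/conf
    export HADOOP_HOME_WARN_SUPPRESS=1                # silence the deprecation warning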

