[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-12-13 Thread Joern Kottmann (Commented) (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13168755#comment-13168755
 ] 

Joern Kottmann commented on HBASE-3465:
---

Sounds like that will fix the issue, but I don't know whether it will have any 
other consequences.

I set HADOOP_HOME and the other environment variables more or less accidentally 
and really wonder why that disables the Ganglia monitoring. As far as I 
understand, HADOOP_HOME is only needed for features I do not use anyway.

Maybe Ted or Alejandro can comment here on whether the suggested fix would 
cause any issues.

 Hbase should use a HADOOP_HOME environment variable if available.
 -

 Key: HBASE-3465
 URL: https://issues.apache.org/jira/browse/HBASE-3465
 Project: HBase
  Issue Type: Bug
Affects Versions: 0.90.0
Reporter: Ted Dunning
Assignee: Alejandro Abdelnur
 Fix For: 0.92.0

 Attachments: a1-HBASE-3465.patch


 I have been burned a few times lately while developing code by having to 
 make sure that the hadoop jar in hbase/lib is exactly correct.  In my own 
 deployment, there are actually 3 jars and a native library to keep in sync 
 that hbase shouldn't have to know about explicitly.  A similar problem arises 
 when using stock hbase with CDH3 because of the security patches changing the 
 wire protocol.
 All of these problems could be avoided by not assuming that the hadoop 
 library is in the local directory.  Moreover, I think it might be possible to 
 assemble the distribution such that the compile time hadoop dependency is in 
 a cognate directory to lib and is referenced using a default value for 
 HADOOP_HOME.
 Does anybody have any violent antipathies to such a change?





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-12-13 Thread Harsh J (Commented) (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13168762#comment-13168762
 ] 

Harsh J commented on HBASE-3465:


HBASE-4854 fixed this, I think?





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-12-12 Thread Joern Kottmann (Commented) (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13167728#comment-13167728
 ] 

Joern Kottmann commented on HBASE-3465:
---

Looks like the Hadoop conf dir now comes before the Hbase conf dir on the 
classpath when the HADOOP_HOME variable is set.

Isn't that problematic for configuration files which exist in both the Hadoop 
and the Hbase conf directory?

Such as:
hadoop-metrics.properties
log4j.properties

When HADOOP_HOME is set, the files from Hadoop are used instead, which might be 
confusing.
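
A quick way to see which copy wins is to walk the assembled classpath in order; 
a minimal sketch, assuming the bin/hbase script supports the classpath command 
and using hadoop-metrics.properties as the example:

{code}
#!/bin/sh
# Print the first classpath entry that contains hadoop-metrics.properties,
# i.e. the copy that actually gets loaded (illustrative only).
for entry in $(bin/hbase classpath | tr ':' ' '); do
  if [ -f "${entry}/hadoop-metrics.properties" ]; then
    echo "hadoop-metrics.properties is resolved from: ${entry}"
    break
  fi
done
{code}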





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-12-12 Thread stack (Commented) (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13167754#comment-13167754
 ] 

stack commented on HBASE-3465:
--

That sounds like a bug, yes, Joern.





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-12-12 Thread stack (Commented) (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13167782#comment-13167782
 ] 

stack commented on HBASE-3465:
--

We should swap the order in which we append to the CLASSPATH so that the hbase 
additions come first?
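
Roughly, that would mean the hbase conf dir and jars go onto the CLASSPATH 
first and whatever HADOOP_HOME contributes is appended afterwards; a minimal 
sketch of that ordering (the variable names are illustrative, not the exact 
ones used in bin/hbase):

{code}
# Hbase entries first, Hadoop entries appended afterwards, so duplicated
# resources such as hadoop-metrics.properties and log4j.properties resolve
# from the Hbase conf dir.
CLASSPATH="${HBASE_CONF_DIR}"
for jar in "${HBASE_HOME}"/lib/*.jar; do
  CLASSPATH="${CLASSPATH}:${jar}"
done
if [ -n "${HADOOP_HOME}" ]; then
  CLASSPATH="${CLASSPATH}:${HADOOP_HOME}/conf"
  for jar in "${HADOOP_HOME}"/hadoop-core*.jar; do
    CLASSPATH="${CLASSPATH}:${jar}"
  done
fi
{code}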






[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-07-21 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13068827#comment-13068827
 ] 

stack commented on HBASE-3465:
--

Looks great AA.

Should we include the test jar?

{code}
+  HADOOPCPPATH=$(append_path ${HADOOPCPPATH} `ls ${HADOOP_HOME}/hadoop-core*.jar`)
{code}

Is this right?

{code}
+if [ -d ${HBASE_HOME}/build/native -o -d ${HBASE_HOME}/lib/native ]; then
{code}

Is it the same thing on the left and the right side of the -o?

Does it work?  If you set HADOOP_HOME, do we use its hadoop and jars over our 
own?

Good stuff.





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-07-21 Thread Alejandro Abdelnur (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13068978#comment-13068978
 ] 

Alejandro Abdelnur commented on HBASE-3465:
---

Thanks Stack.

On the 1st comment: Does Hbase require the Hadoop test JARs? If so, we should 
not only include it; the problem we'll have there is its transitive 
dependencies (which are not in Hadoop's lib).

On the 2nd comment: the left and the right sides are not the same, *build* and 
*lib*. BTW, that line is in the original script; it just shows up as modified 
in the diff (the - is before all the +).





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-07-21 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13069025#comment-13069025
 ] 

stack commented on HBASE-3465:
--

On 1., we use the mini test tools, minidfs, etc., but if HADOOP_HOME is for 
runtime only, we should be able to get away with not needing hadoop-test*.jar.

On 2., my bad.

This patch should be good to go then, AA?





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-07-21 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13069055#comment-13069055
 ] 

stack commented on HBASE-3465:
--

I'm +1 then.  Let me commit in a day or so.  Maybe Ted Dunning will grace us w/ 
an opinion if we give him a chance.  Good on you AA.





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-07-13 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13064725#comment-13064725
 ] 

stack commented on HBASE-3465:
--

Alejandro:

So, what's in the hbase tarball?  Do we include a hadoop jar?  We need to, 
don't we, so that we can run in standalone mode.

If HADOOP_HOME is defined and the hbase/lib dir has hadoop*.jars, we'll need to 
do some pruning of the CLASSPATH.  Currently we include hbase/lib/*... We'd 
have to do a little messing to exclude the hadoop-*.jars. That'd be easy 
enough.

What do you reckon, @Ted?

I can help out, Alejandro.  I volunteer to do the doc portion of this issue.
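
A minimal sketch of that pruning, assuming the bundled Hadoop jars are only 
skipped when HADOOP_HOME is set (the variable names are illustrative, not the 
exact ones used in bin/hbase):

{code}
# Add everything from hbase/lib, but skip the bundled hadoop-*.jar files
# whenever HADOOP_HOME points at an external Hadoop install.
for jar in "${HBASE_HOME}"/lib/*.jar; do
  case "$(basename "${jar}")" in
    hadoop-*.jar)
      [ -n "${HADOOP_HOME}" ] && continue ;;
  esac
  CLASSPATH="${CLASSPATH}:${jar}"
done
{code}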





[jira] [Commented] (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-07-12 Thread Alejandro Abdelnur (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13064032#comment-13064032
 ] 

Alejandro Abdelnur commented on HBASE-3465:
---

Last year I opened a JIRA for Hadoop (HADOOP-6894) proposing a general 
solution for this.

Until there is a general mechanism, Hbase should do something.

IMO Hbase should not duplicate/copy the Hadoop JARs in its setup. It should 
just use HADOOP_HOME, if defined, to bring in the JAR and SO files. And if 
HADOOP_HOME is not set, it assumes all the Hadoop JAR and SO files have been 
copied into the Hbase lib and lib/native/ directories.

This seems simple enough and it should work.

Thoughts?
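
A minimal sketch of that branch, assuming the external install keeps its core 
jar under ${HADOOP_HOME} and its native libraries under lib/native (the paths 
and variable names are illustrative):

{code}
if [ -n "${HADOOP_HOME}" ]; then
  # External Hadoop: take the jars and native libs from the install.
  for jar in "${HADOOP_HOME}"/hadoop-core*.jar "${HADOOP_HOME}"/lib/*.jar; do
    CLASSPATH="${CLASSPATH}:${jar}"
  done
  JAVA_LIBRARY_PATH="${HADOOP_HOME}/lib/native/${JAVA_PLATFORM}"
else
  # No HADOOP_HOME: fall back to the copies bundled with Hbase.
  for jar in "${HBASE_HOME}"/lib/hadoop-*.jar; do
    CLASSPATH="${CLASSPATH}:${jar}"
  done
  JAVA_LIBRARY_PATH="${HBASE_HOME}/lib/native/${JAVA_PLATFORM}"
fi
{code}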





[jira] Commented: (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-02-08 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12992073#comment-12992073
 ] 

stack commented on HBASE-3465:
--

bq. I am not suggesting a new hadoop lib dir. I am suggesting that hbase assume 
that it will use an existing hadoop lib dir from an existing hadoop install.

There is not always an existing Hadoop install.  In particular, if you run 
HBase standalone, it comes up fine without reference to a Hadoop cluster.

Tell me more about your hbase-with-dependencies notion, Ted.  I don't grok it 
too well.

I appreciate your bringing this up.  We've been operating this way for ages 
but it has never been satisfactory.  Appreciate the fresh POV.





[jira] Commented: (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-02-08 Thread Ted Dunning (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12992080#comment-12992080
 ] 

Ted Dunning commented on HBASE-3465:


My suggestion is that there be an hbase jar that contains all dependencies 
except for Hadoop.  In addition, there would be a lib dir with a default 
hadoop.jar that would satisfy all of the transitive dependencies implied by 
Hadoop.

This would allow the startup script (bin/hbase) to detect whether the user has 
provided a hadoop home directory or not.  The script could then build a 
classpath that has either the user-specified hadoop or the default hadoop jar.

From the user's point of view, hbase would generally just work.  For somebody 
working with a different, API-compatible but wire-level-incompatible Hadoop, 
such as Hadoop with security, one environment variable setting will cure the 
problem.






[jira] Commented: (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-02-08 Thread ryan rawson (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12992248#comment-12992248
 ] 

ryan rawson commented on HBASE-3465:


we should consider using jarjar:

http://code.google.com/p/jarjar/

That way we would not have conflicting jar issues with our classpath.






[jira] Commented: (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-02-08 Thread Ted Dunning (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12992269#comment-12992269
 ] 

Ted Dunning commented on HBASE-3465:


The problem I had was not so much conflicting jars as that I wanted an hbase 
build that would accept my hadoop and related jars and only fall back to its 
own when I didn't provide something.

Maven makes this easy by having a provided scope for dependencies.  The only 
missing bit is some shell-fu to include either a default set of hadoop jars or 
the env-specified hadoop jars, as desired.

I will produce a prototype when I have time to scratch.






[jira] Commented: (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-01-26 Thread stack (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12987327#action_12987327
 ] 

stack commented on HBASE-3465:
--

I'm not mad about having a new hadoop-only lib dir.  What if hadoop were in a 
subdir of lib?  Then you could point your HADOOP_HOME at lib/hadoop as the 
default, and if it is overridden, we use the alternative.

But I'm kinda easy on this.  Currently it's a bit of a pain, especially so 
with 0.90.0, given we'll more than likely be deployed atop a Hadoop we are not 
compatible with.
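
In bin/hbase that default could be a one-liner; a minimal sketch, assuming the 
bundled copy lives under lib/hadoop (the path is illustrative):

{code}
# Use the external install when HADOOP_HOME is set, otherwise fall back to
# the copy shipped under ${HBASE_HOME}/lib/hadoop.
HADOOP_HOME="${HADOOP_HOME:-${HBASE_HOME}/lib/hadoop}"
{code}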




[jira] Commented: (HBASE-3465) Hbase should use a HADOOP_HOME environment variable if available.

2011-01-26 Thread Ted Dunning (JIRA)

[ 
https://issues.apache.org/jira/browse/HBASE-3465?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12987342#action_12987342
 ] 

Ted Dunning commented on HBASE-3465:



I am not suggesting a new hadoop lib dir.  I am suggesting that hbase assume 
that it will use an existing hadoop lib dir from an existing hadoop install.

Your suggestion of supplying a default lib directory is interesting since it 
would make hbase self-contained by default (though still delicate because of 
version issues).

My major difficulty with the lib/hadoop/*.jar and HADOOP_HOME=lib/hadoop 
default is that it makes the build significantly fancier.  If somebody has a 
cool and simple way to put the provided-scope dependencies into a different 
directory while still putting the runtime dependencies into lib, then I would 
have no objection.  Sounds to me like it would require a custom assembly at 
the least, but that isn't so hard.

In any case, it still seems that making an hbase-with-dependencies jar would 
be good to do.  In my estimation, this should not contain the hadoop jars, but 
should rely on the same trick.  It would presumably be relatively easy to pour 
all of the hadoop dependencies into a single hadoop-with-dependencies jar if 
it is easy to select all the provided-scope dependencies.  That would give us 
two jars and one hadoop lib directory in the end, which would provide a lot of 
flexibility.

