Yes.
St.Ack

R. James Firby wrote:
Yes, I guess that would work.  We actually split our hadoop configs into two
directories, one for the dfs and one for map/reduce, so we can run multiple
map/reduce clusters on the same set of machines that are all running the dfs.

So, we could just set HADOOP_CONF_DIR in hbase-env.sh to point to the dfs 
config directory and all would be well.  Right?

Jim
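
A minimal sketch of how that could look in conf/hbase-env.sh (the dfs config
path below is only an example, not anyone's actual layout):

  # Point HADOOP_CONF_DIR at the directory holding the dfs hadoop-site.xml
  # (example path; substitute wherever your dfs configs actually live)
  export HADOOP_CONF_DIR=/path/to/hadoop/dfs-conf

  # Put that directory on HBase's extra classpath so HBase picks up the
  # dfs configuration from it
  export HBASE_CLASSPATH=${HADOOP_CONF_DIR}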


On 4/11/08 1:43 PM, "stack" <[EMAIL PROTECTED]> wrote:

R. James Firby wrote:
Let's be careful with this one.

We run hadoop with non-default configs that live in a different directory from 
where hbase and its configs are installed.   For us, a really nice thing would 
be to have a value we can set in the hbase config that would point to the 
proper hadoop config to use.


Does adding whatever directory HADOOP_CONF_DIR points to onto the HBASE_CLASSPATH in
hbase-env.sh work for you, Jim?  Or would you like something else?
Thanks,
St.Ack
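
A rough sketch of that suggestion in conf/hbase-env.sh, assuming HADOOP_CONF_DIR
is already exported by your hadoop setup:

  # HADOOP_CONF_DIR is assumed to be set in the environment already;
  # just add that directory to HBase's extra classpath
  export HBASE_CLASSPATH=${HADOOP_CONF_DIR}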



