Re: Can I install two different versions of hadoop in the same cluster ?

2009-10-30 Thread Amandeep Khurana
Theoretically, you should be able to do this. You'll need to alter all the ports so that there are no conflicts. However, resources might be a problem (assuming that you want daemons of both versions running at the same time). If you just want to run one version at a time, then it should not
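To illustrate the port changes being suggested, a sketch of a hadoop-site.xml for the second installation might look like the fragment below. These are the standard port-bearing properties in 0.18-era Hadoop; the specific port numbers shown (9100, 9101, etc.) are arbitrary examples chosen only to avoid the defaults, not values from the original thread.

```xml
<!-- hadoop-site.xml for the SECOND Hadoop instance: every listening port
     is moved off its default so both versions can run on the same nodes.
     Port numbers here are illustrative, not prescribed. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9100</value>  <!-- default is typically 9000 -->
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>jobtracker-host:9101</value>       <!-- default is typically 9001 -->
  </property>
  <property>
    <name>dfs.http.address</name>
    <value>0.0.0.0:51070</value>              <!-- default 50070 -->
  </property>
  <property>
    <name>dfs.datanode.address</name>
    <value>0.0.0.0:51010</value>              <!-- default 50010 -->
  </property>
  <property>
    <name>mapred.job.tracker.http.address</name>
    <value>0.0.0.0:51030</value>              <!-- default 50030 -->
  </property>
  <property>
    <name>mapred.task.tracker.http.address</name>
    <value>0.0.0.0:51060</value>              <!-- default 50060 -->
  </property>
</configuration>
```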

Re: Can I install two different version of hadoop in the same cluster ?

2009-10-30 Thread Aaron Kimball
Also hadoop.tmp.dir and mapred.local.dir in your xml configuration, and the environment variables HADOOP_LOG_DIR and HADOOP_PID_DIR in hadoop-env.sh. - Aaron On Thu, Oct 29, 2009 at 10:44 PM, Jeff Zhang zjf...@gmail.com wrote: Hi all, I have installed hadoop 0.18.3 on my own cluster with 5
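Putting the directory suggestions together, a minimal sketch of the hadoop-env.sh changes for the second installation could look like this. The path `/var/hadoop-0.19` is a hypothetical example; any location distinct from the first installation's directories would do, and the corresponding hadoop.tmp.dir and mapred.local.dir overrides would go in that installation's xml configuration the same way.

```shell
# hadoop-env.sh for the second Hadoop installation.
# Give logs and pid files their own directories so the two
# versions' daemons do not clobber each other's files.
# (The /var/hadoop-0.19 prefix is an example, not a required path.)
export HADOOP_LOG_DIR=/var/hadoop-0.19/logs
export HADOOP_PID_DIR=/var/hadoop-0.19/pids
```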

EC2 does not support hadoop 0.18.3

2009-10-30 Thread Jeff Zhang
Hi all, I found that the default value of HADOOP_VERSION is 0.17.0 in the hadoop-ec2-env.sh of hadoop 0.18.3, and I can create a hadoop 0.17.0 cluster on EC2 successfully, but I cannot create a hadoop 0.18.3 cluster if I change HADOOP_VERSION to 0.18.3. Besides, the HADOOP_VERSION in hadoop-ec2-env.sh in
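For reference, the change being described is the version variable in the EC2 contrib script, roughly as below. Whether a matching 0.18.3 AMI exists for the script to launch is the likely sticking point; the variable name HADOOP_VERSION is from the thread, but the surrounding context is an assumption about the script's layout.

```shell
# src/contrib/ec2/bin/hadoop-ec2-env.sh (as shipped with 0.18.3)
# The default ships as 0.17.0; changing it to 0.18.3 only works if an
# AMI built for that version is actually available to launch.
HADOOP_VERSION=0.18.3   # was: HADOOP_VERSION=0.17.0
```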

question on data/task node specific configuration...

2009-10-30 Thread Andy Sautins
I've run into a situation where it would be helpful to set specific configuration variables local to a data/task node. I've got a solution, but I'm curious if there is a best practice around this and if I'm doing it in a reasonable way. Basically what we've got is a number of
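One common pattern for this (a sketch, not necessarily the approach the poster settled on): since each tasktracker reads its own local copy of hadoop-site.xml at startup, node-specific values can simply be set in that node's local configuration file rather than the cluster-wide one. For example, giving a beefier node more concurrent map slots than the rest of the cluster:

```xml
<!-- Local hadoop-site.xml on ONE data/task node only.
     This node gets more map slots than the cluster-wide default;
     other nodes keep the value from the shared configuration.
     The value 8 is an illustrative example. -->
<configuration>
  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>8</value>
  </property>
</configuration>
```

The trade-off is that per-node files must be kept in sync by hand (or by a deployment tool), which is presumably why the poster is asking whether a better-established practice exists.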