Hi Boyu, have you tried not deploying the packaged tarball and instead using the extracted Hadoop folder on each machine? I'm using that method and it works fine.

Song Liu
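For reference, a minimal sketch of what that suggestion looks like in hodrc, assuming Hadoop 0.20.2 is already extracted to the same path on every node (the /opt/hadoop-0.20.2 path below is hypothetical) and using the gridservice section and pkgs key names from the sample hodrc shipped in contrib/hod. With pkgs pointing at a local install, the -t tarball option can be dropped from the allocate command:

    # hypothetical shared install path; must exist on every allocated node
    [gridservice-hdfs]
    external = False
    pkgs     = /opt/hadoop-0.20.2

    [gridservice-mapred]
    external = False
    pkgs     = /opt/hadoop-0.20.2

    $ bin/hod allocate -d /home/zhang/cluster -n 4 \
        -c /home/zhang/hadoop-0.20.2/contrib/hod/conf/hodrc
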
在2010-03-24 02:43:44,"Boyu Zhang" <boyuzhan...@gmail.com> 写道: >Thanks for the tip, I use -c to specify hod-conf-dir in the command, and I >set both the two java-home, I still get the same error. I will keep looking >and let you know. > >Boyu > >On Tue, Mar 23, 2010 at 11:09 AM, Antonio D'Ettole <coda...@gmail.com>wrote: > >> Make sure you set HOD_CONF_DIR to /path/to/hod/conf >> Also make sure that, in the file /path/to/hod/conf/hodrc you set >> "java-home" >> (under both [hod] and [hodring] ) to a working JRE or JDK in your system. >> >> Does that work? >> >> Antonio >> >> On Tue, Mar 23, 2010 at 3:34 PM, Boyu Zhang <boyuzhan...@gmail.com> wrote: >> >> > thanks a lot! I found out the HOD_PYTHON_HOME error too: ) >> > >> > I had a new error after I use the correct version of Python: >> > >> > ---------------------------------------------------------- >> > INFO - Cluster Id 62.geronimo.xxx.xxx.xxx.edu >> > CRITICAL - Cluster could not be allocated because of the following errors >> > on >> > the ringmaster host n3. >> > Could not retrive the version of Hadoop. Check the Hadoop installation or >> > the value of the hodring.java-home variable. >> > CRITICAL - Cannot allocate cluster /home/zhang/cluster >> > >> > >> ----------------------------------------------------------------------------------------- >> > >> > Is that because I set my java-home wrong? Thanks! >> > >> > Boyu >> > >> > On Tue, Mar 23, 2010 at 10:04 AM, Antonio D'Ettole <coda...@gmail.com >> > >wrote: >> > >> > > Boyu, >> > > I've found that only Python 2.5.x works with HOD. Version 2.6.x will >> give >> > > you the exception. >> > > You should set HOD_PYTHON_HOME to the path to a 2.5.x executable (not >> the >> > > directory). >> > > >> > > Antonio >> > > >> > > On Mon, Mar 22, 2010 at 5:07 PM, Boyu Zhang <boyuzhan...@gmail.com> >> > wrote: >> > > >> > > > Updata: I used the command: $ bin/hod allocate -d /home/zhang/cluster >> > -n >> > > 4 >> > > > -c /home/zhang/hadoop-0.20.2/contrib/hod/conf/hodrc -t >> > > > /home/zhang/hadoop-0.20.2.tar.gz -b 4 >> > > > >> > > > and I get the errot: Using Python: 2.4.3 (#1, Sep 3 2009, 15:37:37) >> > > > [GCC 4.1.2 20080704 (Red Hat 4.1.2-46)] >> > > > >> > > > Uncaught Exception : need more than 2 values to unpack >> > > > >> > > > I have multiple versions of python running on my system, and I set >> > > env-vars >> > > > in hodrc file to point to the python 2.6.5 version( >> > > > HOD_PYTHON_HOME=/opt/python/2.6.5/bin/python >> > > > ). Do I need to do anything else, like export the HOD_PYTHON_HOM >> > > > environment >> > > > variable? Thanks a lot! >> > > > >> > > > >> > > > On Mon, Mar 22, 2010 at 11:52 AM, Boyu Zhang <boyuzhan...@gmail.com> >> > > > wrote: >> > > > >> > > > > Dear All, >> > > > > >> > > > > I have been trying to get HOD working on a cluster running Scyld. >> But >> > > > there >> > > > > are some problems. I configured the minimum configurations. >> > > > > >> > > > > 1. I executed the command: >> > > > > $ bin/hod allocate -d /home/zhang/cluster -n 4 -c >> > > > > /home/zhang/hadoop-0.20.2/contrib/hod/conf/hodrc -t >> > > > > /home/zhang/hadoop-0.20.2.tar.gz >> > > > > I get the error: file hod, line 576, finally: Syntax error. So I >> > > > commented >> > > > > out the line 576, and try again. >> > > > > >> > > > > 2. 
#[zh...@geronimo hod]$ bin/hod allocate -d /home/zhang/cluster >> > -n >> > > 4 >> > > > -c >> > > > > /home/zhang/hadoop-0.20.2/contrib/hod/conf/hodrc -t >> > > > > /home/zhang/hadoop-0.20.2.tar.gz >> > > > > Uncaught Exception : need more than 2 values to unpack >> > > > > >> > > > > Could anyone tell me why am I having this error? Is the problem the >> > > > > operating system, or Torque, or because I commented out line 576, >> or >> > > > > anything else? >> > > > > >> > > > > Any comment is welcome and appreciated. Thanks a lot! >> > > > > >> > > > > Sincerely, >> > > > > >> > > > > Boyu Zhang >> > > > > >> > > > >> > > >> > >>
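Pulling the advice in the thread together, here is a hedged sketch of the hodrc entries and environment setup being discussed (the JDK path and the Python 2.5.x path below are illustrative, not taken from Boyu's cluster): java-home is set under both [hod] and [hodring], the env-vars key under [resource_manager] passes HOD_PYTHON_HOME to the allocated nodes, and exporting the same variable in the shell covers the client side before running bin/hod:

    # illustrative paths -- substitute a real JDK and a Python 2.5.x build
    [hod]
    java-home = /usr/java/jdk1.6.0

    [hodring]
    java-home = /usr/java/jdk1.6.0

    [resource_manager]
    env-vars  = HOD_PYTHON_HOME=/opt/python/2.5.4/bin/python

    $ export HOD_PYTHON_HOME=/opt/python/2.5.4/bin/python
    $ bin/hod allocate -d /home/zhang/cluster -n 4 \
        -c /home/zhang/hadoop-0.20.2/contrib/hod/conf/hodrc \
        -t /home/zhang/hadoop-0.20.2.tar.gz
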
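On the two error messages themselves: the "finally: Syntax error" at hod line 576 under Python 2.4 is consistent with the unified try/except/finally statement, which only became valid syntax in Python 2.5, and "need more than 2 values to unpack" is the generic Python 2 message raised when a tuple assignment expects more elements than it receives. The snippet below is not HOD code, just a minimal illustration of that second failure mode (the thread does not show where HOD actually raises it):

    # Illustration only: unpacking three names from a two-element sequence
    # produces exactly the message seen in the HOD output (Python 2.x).
    >>> major, minor, patch = "2.4".split(".")
    Traceback (most recent call last):
      ...
    ValueError: need more than 2 values to unpack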