Hi Steve,

Thank you very much for this email. I sent an email and reported a compilation error before I got your email. I am going to try your script and will let you know.
Thank you again,

Gary

--- On Mon, 3/8/10, Stephen Watt <sw...@us.ibm.com> wrote:

> From: Stephen Watt <sw...@us.ibm.com>
> Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> To: general@hadoop.apache.org
> Date: Monday, March 8, 2010, 12:24 PM
>
> Hi Gary
>
> This is a script I put together based on the WikiPage link Owen sent you
> that will build most versions of Hadoop including 0.20.2. You are
> welcome to use it. Notice how I have JAVA_HOME point to Java 6 and JAVA5
> point to Java 5 (for Forrest). In this script it's pointing to IBM Java
> installations, but you can use a Sun/Oracle JDK if you wish as well.
> The Wiki Page should be able to answer other details.
>
> #!/bin/sh
> export VERSION=0.20.2
> set PATH=$PATH:/home/hadoop/Java-Versions/ibm-java-i386-60/bin/
> export HADOOP_INSTALL=/home/hadoop/Hadoop-Versions/hadoop-$VERSION
> export FORREST_INSTALL=/home/hadoop/Test-Dependencies/apache-forrest-0.8
> export XERCES_INSTALL=/home/hadoop/Test-Dependencies/xerces-c_2_8_0
> export ANT_HOME=/home/hadoop/Test-Dependencies/apache-ant-1.7.1
> export JAVA_HOME=/home/hadoop/Java-Versions/ibm-java-i386-60
> export JAVA5=/home/hadoop/Java-Versions/ibm-java2-i386-50
> export CFLAGS=-m32
> export CXXFLAGS=-m32
> export PATH=$PATH:$ANT_HOME/bin
>
> cd $HADOOP_INSTALL
>
> # For some reason these scripts do not have execute permissions
> chmod 777 src/c++/utils/configure
> chmod 777 src/examples/pipes/configure
> chmod 777 src/native/configure
>
> # Clean, Build and Run the Core (Non-Contrib) Unit Tests
> ant -Dversion=$VERSION -Dcompile.native=true -Dcompile.c++=true \
>   -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=$XERCES_INSTALL \
>   -Dforrest.home=$FORREST_INSTALL -Djava5.home=$JAVA5 clean tar test-core \
>   > /home/hadoop/Test-Scripts/Hadoop-$VERSION/ibm32build.out
>
> Kind regards
> Steve Watt
>
>
> From: Gary Yang <garyya...@yahoo.com>
> To: general@hadoop.apache.org
> Date: 03/08/2010 12:07 PM
> Subject: Re: Compilation failed when compile hadoop common release-0.20.2
>
> Hi Owen,
>
> Thanks for the reply. From the link you provided, I found the build
> instruction. I do not understand the option "-Djava5.home=/usr/local/jdk1.5".
> Does it mean I have to use JDK 1.5? I read somewhere it suggested
> using JDK 1.6.
>
> Also, the very first line is "export JAVA_HOME=/path/to/32bit/jdk".
> Does it mean the JAVA_HOME jdk has to be 1.5? Please let me know.
>
> export JAVA_HOME=/path/to/32bit/jdk
> export CFLAGS=-m32
> export CXXFLAGS=-m32
> ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 \
>   -Dlibrecordio=true -Dxercescroot=/usr/local/xerces-c \
>   -Declipse.home=/usr/lib/eclipse -Dforrest.home=/usr/local/forrest \
>   -Djava5.home=/usr/local/jdk1.5 clean api-report tar test test-c++-libhdfs
> export JAVA_HOME=/path/to/64bit/jdk
> export CFLAGS=-m64
> export CXXFLAGS=-m64
> ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true \
>   compile-core-native compile-c++ tar
>
> Thanks,
>
> Gary
>
> --- On Fri, 3/5/10, Owen O'Malley <omal...@apache.org> wrote:
>
> > From: Owen O'Malley <omal...@apache.org>
> > Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> > To: general@hadoop.apache.org
> > Date: Friday, March 5, 2010, 4:32 PM
> >
> > On Mar 5, 2010, at 3:47 PM, Gary Yang wrote:
> >
> > > Hi,
> > >
> > > I try to compile hadoop common of the release 0.20.2.
> > > Below are the error messages and java and ant versions I am
> > > using. Please tell me what I missed.
> >
> > Forrest, which we use to generate the documentation,
> > requires java 5. Therefore, run:
> >
> > ant -Djava5.home=/some/path/to/java5 tar
> >
> > There are several more you need. For a more complete list,
> > I'd look at the how to release page:
> >
> > http://wiki.apache.org/hadoop/HowToRelease
> >
> > -- Owen
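[Editor's note: the thread above hinges on having two JDKs set up correctly, a Java 6 JDK in JAVA_HOME for the build and a Java 5 JDK in JAVA5 for Forrest. Below is a minimal sketch, not from the thread, of a fail-fast version check a build script could run before invoking ant. The function names (`parse_major`, `require_version`) and the sample version strings are illustrative assumptions; in a real script the version line would come from `"$JAVA_HOME/bin/java" -version 2>&1`.]

```shell
#!/bin/sh
# Sketch: verify that two JDK paths report the expected major versions
# before starting the Hadoop build (JAVA_HOME -> 1.6, JAVA5 -> 1.5).

parse_major() {
    # Extract the major version from a `java -version` style line,
    # e.g. 'java version "1.6.0_45"' -> 1.6
    echo "$1" | sed -n 's/.*"\([0-9]*\.[0-9]*\)\..*/\1/p'
}

require_version() {
    # $1 = label, $2 = version line, $3 = expected major version
    major=$(parse_major "$2")
    if [ "$major" != "$3" ]; then
        echo "$1: expected Java $3, got $major" >&2
        return 1
    fi
    echo "$1: Java $major OK"
}

# Usage (version lines would normally come from the JDKs themselves):
require_version JAVA_HOME 'java version "1.6.0_45"' 1.6
require_version JAVA5     'java version "1.5.0_22"' 1.5
```

A check like this turns the confusing "Forrest requires java 5" failure Owen describes into an immediate, readable error at the top of the script.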