
The "HarmonyMapreduce" page has been changed by GuillermoCabrera.
http://wiki.apache.org/hadoop/HarmonyMapreduce

= Hadoop mapreduce on Harmony/Harmony Select 6 =

If you have already set up Hadoop common and hdfs, many of the steps outlined on 
this page can be skipped. The build is done without building the native 
libraries, as there are some issues we describe below. With the scripts and 
patches below there are currently 21 test failures for this project, 2 of which 
fail because of an explicit call made in the test case to sun/reflect from the 
[[http://mockito.org/|mockito]] project.

=== Environment Setup ===

 1. Download the missing dependencies: Apache Forrest, Xerces-C++ Parser, Apache 
Ant, Apache Maven, and a Java 5.0 JRE (needed by Apache Forrest)
 2. Download the ecj jar from the 
[[http://download.eclipse.org/eclipse/downloads/drops/S-3.7M3-201010281441/index.php|3.7M3
 build]] (this particular build contains a fix for a bug affecting the build 
process), place it in `$HARMONY_HOME/hdk/jdk/lib`, and add an entry for it to 
bootclasspath.properties
 3. Copy tools.jar from `$HARMONY_HOME/hdk/jdk/lib` into 
`$HARMONY_HOME/hdk/jdk/jre/lib/boot`, then add an entry for tools.jar to 
bootclasspath.properties in `$HARMONY_HOME/hdk/jdk/jre/lib/boot`
 4. Create a jar file (sun-javadoc.jar) in `$HARMONY_HOME/hdk/jdk/jre/lib/boot` 
containing all javadoc-related classes from the SUN JDK 1.6, then add an entry 
for sun-javadoc.jar to bootclasspath.properties in 
`$HARMONY_HOME/hdk/jdk/jre/lib/boot`
 5. Download the Hadoop mapreduce source: {{{
 % svn checkout http://svn.apache.org/repos/asf/hadoop/mapreduce/tags/release-0.21.0/ mapreduce
}}}
 6. Download the patches and place them in the appropriate directory (refer to 
the build script)
 7. Download, modify, and run the build script
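Steps 2-4 all follow the same pattern: drop a jar into the boot directory and register it in bootclasspath.properties. The sketch below illustrates only that pattern; the numeric key format (`bootclasspath.NN=jar`) and the stand-in directory are assumptions, so check the existing entries in your own bootclasspath.properties for the real scheme and the next free index.

```shell
#!/bin/sh
# Sketch of steps 2-4: register extra jars on the Harmony boot classpath.
# BOOT_DIR stands in for $HARMONY_HOME/hdk/jdk/jre/lib/boot, and the
# bootclasspath.NN=jar key format is an assumption -- verify it against the
# entries already present in your bootclasspath.properties.
BOOT_DIR=${BOOT_DIR:-/tmp/harmony-boot-demo}
mkdir -p "$BOOT_DIR"
: > "$BOOT_DIR/bootclasspath.properties"

n=39
for jar in ecj.jar tools.jar sun-javadoc.jar; do
    # in the real setup the jar itself is copied or created here first
    n=$((n + 1))
    echo "bootclasspath.$n=$jar" >> "$BOOT_DIR/bootclasspath.properties"
done
cat "$BOOT_DIR/bootclasspath.properties"
```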

=== Testing ===

 1. Copy swing.jar from Harmony 6 into Harmony Select's `jre/lib/ext` (Ivy 
requires Swing to run the testing framework)
 2. Copy rmi.jar from Harmony 6 into Harmony Select's `jre/lib/boot` and add an 
entry for it to `bootclasspath.properties`
 3. Comment out the line `xmlsec-1.4.3/commons-logging.jar` in Harmony Select's 
`jre/lib/boot/bootclasspath.properties`
 4. Create a jar file (sunExtra.jar) in `$HARMONY_HOME/hdk/jdk/jre/lib/boot` 
containing all the sun/reflect and sun/misc classes from the SUN JDK 1.6, then 
add an entry for sunExtra.jar to bootclasspath.properties
 5. Create a soft link named libjvm.so pointing to libharmonyvm.so in Harmony 
Select's `jre/bin/default`
 6. Download, modify, and run the test script
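Step 5 exists because Hadoop's launcher expects a library named libjvm.so while Harmony Select ships libharmonyvm.so; a soft link bridges the two names. A minimal sketch, using a temporary directory as a stand-in for the real Harmony Select JRE:

```shell
#!/bin/sh
# Sketch of step 5: link libjvm.so to libharmonyvm.so.
# HS_JRE is a stand-in for Harmony Select's jre directory.
HS_JRE=${HS_JRE:-/tmp/hs-demo/jre}
mkdir -p "$HS_JRE/bin/default"
touch "$HS_JRE/bin/default/libharmonyvm.so"   # stand-in for the real VM library
ln -sf libharmonyvm.so "$HS_JRE/bin/default/libjvm.so"
ls -l "$HS_JRE/bin/default/libjvm.so"
```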

=== Patches ===

 * [[https://issues.apache.org/jira/browse/MAPREDUCE-2183| MAPREDUCE-2183]]
 * [[https://issues.apache.org/jira/browse/MAPREDUCE-2190| MAPREDUCE-2190]]
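The build script applies these patches with `patch -p0`, which takes the file paths from the patch header verbatim. A self-contained sketch of that mechanic on a toy file (a.txt and demo.patch are made-up names for the demo only):

```shell
#!/bin/sh
# Demonstrates the `patch -p0` invocation used by the build script.
WORK=/tmp/patch-demo
mkdir -p "$WORK"
cd "$WORK"
printf 'hello\n' > a.txt
printf 'hello\nworld\n' > a.txt.new
diff -u a.txt a.txt.new > demo.patch || true   # diff exits 1 when files differ
patch -p0 a.txt < demo.patch                   # -p0: take paths from the patch verbatim
cat a.txt
```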

=== Build Script ===

{{{
#!/bin/sh
export SUBPROJECT=mapreduce
export VERSION=0.21.0
export VERSIONBUILD=0.21.0-SNAPSHOT
export PATCH_DIR=/home/harmony/Hadoop-Patches/Harmony/$VERSION-$SUBPROJECT
#Folder containing a clean version of Hadoop common needed to install patches
export PRISTINE=/home/harmony/Hadoop-Versions/pristine
#Note: we are using the SUN JDK; see the issues section below for why we don't use Harmony
export JAVA_HOME=/home/harmony/Java-Versions/jdk1.6.0_14
export HADOOP_INSTALL=/home/harmony/Hadoop-Versions/hadoop-$VERSION
export FORREST_INSTALL=/home/harmony/Test-Dependencies/apache-forrest-0.8
export XERCES_INSTALL=/home/harmony/Test-Dependencies/xerces-c_2_8_0
export ANT_HOME=/home/harmony/Test-Dependencies/apache-ant-1.8.1
#Java 5 required by Forrest
export JAVA5=/home/harmony/Java-Versions/ibm-java2-i386-50/jre
export PATH=$PATH:$ANT_HOME/bin
export CFLAGS=-m32
export CXXFLAGS=-m32
export PATH=$PATH:$JAVA_HOME/bin

#clean (Clean targets are necessary to apply patches)
echo "Cleaning and Copying From Pristine"
rm -rf $HADOOP_INSTALL/$SUBPROJECT
cp -r $PRISTINE/hadoop-$VERSION/$SUBPROJECT $HADOOP_INSTALL/$SUBPROJECT

# Apply Patches
echo "Applying Patches"
cd $HADOOP_INSTALL/$SUBPROJECT

patch -p0 < $PATCH_DIR/MAPREDUCE-2183.patch
patch -p0 < $PATCH_DIR/MAPREDUCE-2190.patch

echo "Starting Build"
ant -Dversion=$VERSIONBUILD -Dcompile.c++=true -Dlibrecordio=true \
  -Dxercescroot=$XERCES_INSTALL -Dforrest.home=$FORREST_INSTALL \
  -Djava5.home=$JAVA5 mvn-install -Dresolvers=internal > \
  /home/harmony/Test-Scripts/Hadoop-$VERSION/SUN32build-$SUBPROJECT-noNative.out 2>&1
}}}

=== Test Script ===
The following script only runs (it does not compile) all of the tests in the 
test-core target. It assumes that you have already built Hadoop common with 
another JDK.
{{{
#!/bin/sh
export SUBPROJECT=mapreduce
export VERSION=0.21.0
export VERSIONBUILD=0.21.0-SNAPSHOT
export JAVA_HOME=/home/harmony/Java-Versions/harmonySelect6-1022137/java6/target/hdk/jdk
export HADOOP_INSTALL=/home/harmony/Hadoop-Versions/hadoop-$VERSION
export ANT_HOME=/home/harmony/Test-Dependencies/apache-ant-1.8.1
export PATH=$PATH:$ANT_HOME/bin
export PATH=$PATH:$JAVA_HOME/bin

cd $HADOOP_INSTALL/$SUBPROJECT

echo "Testing Hadoop Mapreduce"
ant -Dsun.arch.data.model=32 -Dversion=$VERSIONBUILD run-test-core-nocompile \
  -Dresolvers=internal > \
  /home/harmony/Test-Scripts/Hadoop-$VERSION/HSTest-$SUBPROJECT.out 2>&1
}}}

Note: To run a single test case, add the -Dtestcase=testClass property to the 
ant command line.
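For example, with a hypothetical test class name (TestJobConf is an illustration only, not a class named on this page), the ant line from the test script would become:

```shell
# Hypothetical single-test run; TestJobConf is an example class name only
ant -Dsun.arch.data.model=32 -Dversion=$VERSIONBUILD \
  -Dtestcase=TestJobConf run-test-core-nocompile -Dresolvers=internal
```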

=== Issues ===

==== Building Native ====

The flags for building the native libraries are already included in the build 
script above, so there is no need to run a separate native build or to copy 
native libraries as was done for Hadoop common and hdfs.

For further reference when building native libraries, see the 
[[http://hadoop.apache.org/common/docs/current/native_libraries.html|Native 
Libraries Guide]].

==== Others ====

 * Mapreduce does not compile when using Apache Harmony 6. There is one error in 
this project, in SpillRecord.java (line 141), with the message "The type 
IndexRecord is already defined". Our initial guess is that this is related to 
ecj, as we saw a similar problem with primitive types before. For this reason we 
use the SUN JDK to build mapreduce.
