See <https://builds.apache.org/job/Hadoop-0.20.203-Build/13/changes>
Changes:

[gkesavan] Fix mr279 build script
[gkesavan] Build scripts for MR-279 branch

------------------------------------------
[...truncated 11699 lines...]
     [exec] checking for xlf... no
     [exec] checking for f77... no
     [exec] checking for frt... no
     [exec] checking for pgf77... no
     [exec] checking for cf77... no
     [exec] checking for fort77... no
     [exec] checking for fl32... no
     [exec] checking for af77... no
     [exec] checking for xlf90... no
     [exec] checking for f90... no
     [exec] checking for pgf90... no
     [exec] checking for pghpf... no
     [exec] checking for epcf90... no
     [exec] checking for gfortran... no
     [exec] checking for g95... no
     [exec] checking for xlf95... no
     [exec] checking for f95... no
     [exec] checking for fort... no
     [exec] checking for ifort... no
     [exec] checking for ifc... no
     [exec] checking for efc... no
     [exec] checking for pgf95... no
     [exec] checking for lf95... no
     [exec] checking for ftn... no
     [exec] checking whether we are using the GNU Fortran 77 compiler... no
     [exec] checking whether accepts -g... no
     [exec] checking the maximum length of command line arguments... 32768
     [exec] checking command to parse /usr/bin/nm -B output from gcc object... ok
     [exec] checking for objdir... .libs
     [exec] checking for ar... ar
     [exec] checking for ranlib... ranlib
     [exec] checking for strip... strip
     [exec] checking if gcc static flag works... yes
     [exec] checking if gcc supports -fno-rtti -fno-exceptions... no
     [exec] checking for gcc option to produce PIC... -fPIC
     [exec] checking if gcc PIC flag -fPIC works... yes
     [exec] checking if gcc supports -c -o file.o... yes
     [exec] checking whether the gcc linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking whether -lc should be explicitly linked in... no
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] checking if libtool supports shared libraries... yes
     [exec] checking whether to build shared libraries... yes
     [exec] checking whether to build static libraries... yes
     [exec] configure: creating libtool
     [exec] appending configuration tag "CXX" to libtool
     [exec] checking for ld used by g++... /usr/bin/ld -m elf_x86_64
     [exec] checking if the linker (/usr/bin/ld -m elf_x86_64) is GNU ld... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking for g++ option to produce PIC... -fPIC
     [exec] checking if g++ PIC flag -fPIC works... yes
     [exec] checking if g++ supports -c -o file.o... yes
     [exec] checking whether the g++ linker (/usr/bin/ld -m elf_x86_64) supports shared libraries... yes
     [exec] checking dynamic linker characteristics... GNU/Linux ld.so
     [exec] checking how to hardcode library paths into programs... immediate
     [exec] checking whether stripping libraries is possible... yes
     [exec] appending configuration tag "F77" to libtool
     [exec] checking for unistd.h... (cached) yes
     [exec] checking for stdbool.h that conforms to C99... yes
     [exec] checking for _Bool... no
     [exec] checking for an ANSI C-conforming const... yes
     [exec] checking for off_t... yes
     [exec] checking for size_t... yes
     [exec] checking whether strerror_r is declared... yes
     [exec] checking for strerror_r... yes
     [exec] checking whether strerror_r returns char *... yes
     [exec] checking for mkdir... yes
     [exec] checking for uname... yes
     [exec] checking for shutdown in -lsocket... no
     [exec] checking for xdr_float in -lnsl... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating impl/config.h
     [exec] config.status: impl/config.h is unchanged
     [exec] config.status: executing depfiles commands

compile-c++-examples-pipes:
     [exec] depbase=`echo impl/wordcount-simple.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] if g++ -DHAVE_CONFIG_H -I. -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes -I./impl -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -MT impl/wordcount-simple.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-simple.o https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes/impl/wordcount-simple.cc; \
     [exec] then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o wordcount-simple impl/wordcount-simple.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] mkdir .libs
     [exec] g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o wordcount-simple impl/wordcount-simple.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/wordcount-part.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] if g++ -DHAVE_CONFIG_H -I. -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes -I./impl -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -MT impl/wordcount-part.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-part.o https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes/impl/wordcount-part.cc; \
     [exec] then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o wordcount-part impl/wordcount-part.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o wordcount-part impl/wordcount-part.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/wordcount-nopipe.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] if g++ -DHAVE_CONFIG_H -I. -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes -I./impl -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -MT impl/wordcount-nopipe.o -MD -MP -MF "$depbase.Tpo" -c -o impl/wordcount-nopipe.o https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes/impl/wordcount-nopipe.cc; \
     [exec] then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o wordcount-nopipe impl/wordcount-nopipe.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o wordcount-nopipe impl/wordcount-nopipe.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] depbase=`echo impl/sort.o | sed 's|[^/]*$|.deps/&|;s|\.o$||'`; \
     [exec] if g++ -DHAVE_CONFIG_H -I. -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes -I./impl -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -MT impl/sort.o -MD -MP -MF "$depbase.Tpo" -c -o impl/sort.o https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/examples/pipes/impl/sort.cc; \
     [exec] then mv -f "$depbase.Tpo" "$depbase.Po"; else rm -f "$depbase.Tpo"; exit 1; fi
     [exec] /bin/bash ./libtool --mode=link --tag=CXX g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o pipes-sort impl/sort.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] g++ -Wall -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -Ihttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/include -g -O2 -o pipes-sort impl/sort.o -Lhttps://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++/Linux-i386-32/lib -lhadooppipes -lhadooputils -lnsl -lssl -lpthread
     [exec] make[1]: Entering directory `https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-build/Linux-i386-32/examples/pipes'
     [exec] test -z "https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin" || mkdir -p -- "https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin"
     [exec] /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-simple' 'https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/wordcount-simple'
     [exec] /usr/bin/install -c wordcount-simple https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/wordcount-simple
     [exec] /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-part' 'https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/wordcount-part'
     [exec] /usr/bin/install -c wordcount-part https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/wordcount-part
     [exec] /bin/bash ./libtool --mode=install /usr/bin/install -c 'wordcount-nopipe' 'https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/wordcount-nopipe'
     [exec] /usr/bin/install -c wordcount-nopipe https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/wordcount-nopipe
     [exec] /bin/bash ./libtool --mode=install /usr/bin/install -c 'pipes-sort' 'https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/pipes-sort'
     [exec] /usr/bin/install -c pipes-sort https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-examples/Linux-i386-32/bin/pipes-sort
     [exec] make[1]: Nothing to be done for `install-data-am'.
     [exec] make[1]: Leaving directory `https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/c++-build/Linux-i386-32/examples/pipes'

compile-c++-examples:

compile-examples:

generate-test-records:

compile-core-test:
    [javac] Compiling 7 source files to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/classes
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 1 source file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/classes
    [javac] Compiling 7 source files to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/testjar
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
   [delete] Deleting: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/testjar/testjob.jar
      [jar] Building jar: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/testjar/testjob.jar
    [javac] Compiling 1 source file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/testshell
    [javac] Note: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/test/testshell/ExternalMapReduce.java uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
   [delete] Deleting: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/testshell/testshell.jar
      [jar] Building jar: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/testshell/testshell.jar
   [delete] Deleting directory https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
    [mkdir] Created dir: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
   [delete] Deleting directory https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/debug
    [mkdir] Created dir: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/debug
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/debug
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/test/cache

test-contrib:

test:

Trying to override old definition of task macro_tar

check-contrib:

init:
     [echo] contrib: hdfsproxy

init-contrib:

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/ivy/ivy-2.1.0.jar
      [get] Not modified - so not downloaded

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:

ivy-resolve-common:
[ivy:resolve] :: loading settings :: file = https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hadoop#hdfsproxy;work...@h8.grid.sp2.yahoo.net
[ivy:resolve]   confs: [common]
[ivy:resolve]   found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve]   found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve]   found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve]   found log4j#log4j;1.2.15 in maven2
[ivy:resolve]   found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve]   found junit#junit;4.5 in maven2
[ivy:resolve]   found org.slf4j#slf4j-api;1.4.3 in maven2
[ivy:resolve]   found org.slf4j#slf4j-log4j12;1.4.3 in maven2
[ivy:resolve]   found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve]   found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve]   found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve]   found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve]   found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve]   found org.codehaus.jackson#jackson-mapper-asl;1.0.1 in maven2
[ivy:resolve]   found org.codehaus.jackson#jackson-core-asl;1.0.1 in maven2
[ivy:resolve]   found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve]   found commons-collections#commons-collections;3.2.1 in maven2
[ivy:resolve]   found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve]   found commons-logging#commons-logging;1.1.1 in maven2
[ivy:resolve]   found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve]   found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve]   found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve]   found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve] :: resolution report :: resolve 111ms :: artifacts dl 6ms
[ivy:resolve]   :: evicted modules:
[ivy:resolve]   commons-logging#commons-logging;1.0.4 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve]   commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [common]
[ivy:resolve]   commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [common]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      common      |   25  |   0   |   0   |   3   ||   22  |   0   |
        ---------------------------------------------------------------------
ivy-retrieve-common:
[ivy:retrieve] :: retrieving :: org.apache.hadoop#hdfsproxy [sync]
[ivy:retrieve]  confs: [common]
[ivy:retrieve]  0 artifacts copied, 22 already retrieved (0kB/4ms)
[ivy:cachepath] :: loading settings :: file = https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/ivy/ivysettings.xml

compile:
     [echo] contrib: hdfsproxy

compile-examples:

compile-test:
     [echo] contrib: hdfsproxy
    [javac] Compiling 5 source files to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build/contrib/hdfsproxy/test

test-junit:
     [copy] Copying 11 files to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources/proxy-config
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources
     [copy] Copying 1 file to https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/contrib/hdfsproxy/src/test/resources
    [junit] Running org.apache.hadoop.hdfsproxy.TestHdfsProxy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 5.127 sec
    [junit] Test org.apache.hadoop.hdfsproxy.TestHdfsProxy FAILED

BUILD FAILED
https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build.xml:1083: The following error occurred while executing this line:
https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/build.xml:1072: The following error occurred while executing this line:
https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/contrib/build.xml:51: The following error occurred while executing this line:
https://builds.apache.org/job/Hadoop-0.20.203-Build/ws/trunk/src/contrib/hdfsproxy/build.xml:278: Tests failed!

Total time: 213 minutes 57 seconds
Recording test results
Publishing Javadoc
Archiving artifacts
Recording fingerprints
Description set: