See <https://builds.apache.org/job/Hadoop-Common-trunk/206/changes>

Changes:

[vinodkv] MAPREDUCE-2708. Designed and implemented MR Application Master 
recovery to make MR AMs resume their progress after restart. Contributed by 
Sharad Agarwal.

[shv] HDFS-2452. OutOfMemoryError in DataXceiverServer takes down the DataNode. 
Contributed by Uma Maheswara Rao.

[stevel] HDFS-2485

------------------------------------------
[...truncated 7756 lines...]
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ 
hadoop-mapreduce-client-jobclient ---
[INFO] Installing 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/hadoop-mapreduce-client-jobclient-0.24.0-SNAPSHOT.jar>
 to 
/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/0.24.0-SNAPSHOT/hadoop-mapreduce-client-jobclient-0.24.0-SNAPSHOT.jar
[INFO] Installing 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/pom.xml>
 to 
/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/0.24.0-SNAPSHOT/hadoop-mapreduce-client-jobclient-0.24.0-SNAPSHOT.pom
[INFO] Installing 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/hadoop-mapreduce-client-jobclient-0.24.0-SNAPSHOT-tests.jar>
 to 
/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/0.24.0-SNAPSHOT/hadoop-mapreduce-client-jobclient-0.24.0-SNAPSHOT-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building hadoop-mapreduce 0.24.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-mapreduce ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-mapreduce-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (tar) @ hadoop-mapreduce ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ 
hadoop-mapreduce ---
[INFO] Installing 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-mapreduce-project/pom.xml>
 to 
/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-mapreduce/0.24.0-SNAPSHOT/hadoop-mapreduce-0.24.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Tools 0.24.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ 
hadoop-tools-project ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-tools-project 
---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-tools/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ 
hadoop-tools-project ---
[INFO] Installing 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-tools/pom.xml>
 to 
/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-tools-project/0.24.0-SNAPSHOT/hadoop-tools-project-0.24.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Distribution 0.24.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-dist ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-dist ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-dist/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ 
hadoop-dist ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-dist/src/main/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ hadoop-dist 
---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:testResources (default-testResources) @ 
hadoop-dist ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-dist/src/test/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ 
hadoop-dist ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.9:test (default-test) @ hadoop-dist ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-dist ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-dist/target/hadoop-dist-0.24.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-install-plugin:2.3.1:install (default-install) @ hadoop-dist 
---
[INFO] Installing 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-dist/target/hadoop-dist-0.24.0-SNAPSHOT.jar>
 to 
/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-dist/0.24.0-SNAPSHOT/hadoop-dist-0.24.0-SNAPSHOT.jar
[INFO] Installing 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-dist/pom.xml>
 to 
/home/jenkins/.m2/repository/org/apache/hadoop/hadoop-dist/0.24.0-SNAPSHOT/hadoop-dist-0.24.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [0.683s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [0.409s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [1.462s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.366s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.142s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [2.030s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.977s]
[INFO] Apache Hadoop Common .............................. SUCCESS [22.646s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.024s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [16.404s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.024s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.084s]
[INFO] hadoop-yarn-api ................................... SUCCESS [6.307s]
[INFO] hadoop-yarn-common ................................ SUCCESS [9.759s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.064s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [3.045s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [4.814s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [6.225s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [1.144s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.054s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [1.968s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.084s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.050s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [10.335s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [5.500s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [1.513s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [5.930s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [2.145s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [1.974s]
[INFO] hadoop-mapreduce .................................. SUCCESS [0.076s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.022s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [0.151s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:47.199s
[INFO] Finished at: Mon Oct 24 09:03:02 UTC 2011
[INFO] Final Memory: 110M/774M
[INFO] ------------------------------------------------------------------------
+ cd hadoop-common-project
+ /home/jenkins/tools/maven/latest/bin/mvn clean verify checkstyle:checkstyle 
findbugs:findbugs -DskipTests -Pdist -Dtar -Psrc -Pnative -Pdocs
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Hadoop Annotations
[INFO] Apache Hadoop Auth
[INFO] Apache Hadoop Auth Examples
[INFO] Apache Hadoop Common
[INFO] Apache Hadoop Common Project
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Annotations 0.24.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-annotations 
---
[INFO] Deleting 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-annotations 
---
[INFO] Executing tasks

main:
    [mkdir] Created dir: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:resources (default-resources) @ 
hadoop-annotations ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/src/main/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ 
hadoop-annotations ---
[INFO] Compiling 7 source files to 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/target/classes>
[INFO] 
[INFO] --- maven-resources-plugin:2.4.3:testResources (default-testResources) @ 
hadoop-annotations ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/src/test/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ 
hadoop-annotations ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.9:test (default-test) @ hadoop-annotations 
---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-annotations ---
[INFO] Building jar: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/target/hadoop-annotations-0.24.0-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-assembly-plugin:2.2-beta-3:single (src-dist) @ 
hadoop-annotations ---
[INFO] Reading assembly descriptor: 
hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Annotations ......................... FAILURE [2.138s]
[INFO] Apache Hadoop Auth ................................ SKIPPED
[INFO] Apache Hadoop Auth Examples ....................... SKIPPED
[INFO] Apache Hadoop Common .............................. SKIPPED
[INFO] Apache Hadoop Common Project ...................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.978s
[INFO] Finished at: Mon Oct 24 09:03:07 UTC 2011
[INFO] Final Memory: 15M/150M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-3:single (src-dist) on 
project hadoop-annotations: Error reading assemblies: Error locating assembly 
descriptor: hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml
[ERROR] 
[ERROR] [1] [INFO] Searching for file location: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml>
[ERROR] 
[ERROR] [2] [INFO] File: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-annotations/hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml>
 does not exist.
[ERROR] 
[ERROR] [3] [INFO] Invalid artifact specification: 
'hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml'. Must contain 
at least three fields, separated by ':'.
[ERROR] 
[ERROR] [4] [INFO] Failed to resolve classpath resource: 
/assemblies/hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml from 
classloader: 
ClassRealm[plugin>org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-3, 
parent: sun.misc.Launcher$AppClassLoader@192d342]
[ERROR] 
[ERROR] [5] [INFO] Failed to resolve classpath resource: 
hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml from 
classloader: 
ClassRealm[plugin>org.apache.maven.plugins:maven-assembly-plugin:2.2-beta-3, 
parent: sun.misc.Launcher$AppClassLoader@192d342]
[ERROR] 
[ERROR] [6] [INFO] File: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml>
 does not exist.
[ERROR] 
[ERROR] [7] [INFO] Building URL from location: 
hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml
[ERROR] Error:
[ERROR] java.net.MalformedURLException: no protocol: 
hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml
[ERROR] at java.net.URL.<init>(URL.java:567)
[ERROR] at java.net.URL.<init>(URL.java:464)
[ERROR] at java.net.URL.<init>(URL.java:413)
[ERROR] at 
org.apache.maven.shared.io.location.URLLocatorStrategy.resolve(URLLocatorStrategy.java:54)
[ERROR] at org.apache.maven.shared.io.location.Locator.resolve(Locator.java:81)
[ERROR] at 
org.apache.maven.plugin.assembly.io.DefaultAssemblyReader.addAssemblyFromDescriptor(DefaultAssemblyReader.java:309)
[ERROR] at 
org.apache.maven.plugin.assembly.io.DefaultAssemblyReader.readAssemblies(DefaultAssemblyReader.java:140)
[ERROR] at 
org.apache.maven.plugin.assembly.mojos.AbstractAssemblyMojo.execute(AbstractAssemblyMojo.java:328)
[ERROR] at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
[ERROR] at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
[ERROR] at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
[ERROR] at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
[ERROR] at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
[ERROR] at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
[ERROR] at 
org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
[ERROR] at 
org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
[ERROR] at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:319)
[ERROR] at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
[ERROR] at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
[ERROR] at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
[ERROR] at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
[ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[ERROR] at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
[ERROR] at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
[ERROR] at java.lang.reflect.Method.invoke(Method.java:597)
[ERROR] at 
org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
[ERROR] at 
org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
[ERROR] at 
org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
[ERROR] at 
org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
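The `src-dist` execution fails because the assembly descriptor is configured as the relative path `hadoop-assemblies/src/main/resources/assemblies/hadoop-src.xml`, which the assembly plugin resolves against each module's own basedir; when Maven is invoked from `hadoop-common-project` instead of the trunk root, none of the seven locator strategies shown in the error output can find that file. A sketch of a more location-independent wiring, assuming (hypothetically) that the `hadoop-assemblies` module packages the descriptor on its classpath as `assemblies/hadoop-src.xml`, so it can be referenced via `descriptorRefs` rather than a filesystem path:

```xml
<!-- Hypothetical pom.xml fragment: declare hadoop-assemblies as a
     dependency of the assembly plugin so the descriptor is loaded
     from the plugin classpath instead of a basedir-relative path. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-assemblies</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
  <executions>
    <execution>
      <id>src-dist</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <!-- resolves assemblies/hadoop-src.xml from the classpath -->
        <descriptorRefs>
          <descriptorRef>hadoop-src</descriptorRef>
        </descriptorRefs>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With a classpath `descriptorRef`, the classpath locator strategies (attempts [4] and [5] in the error listing) would succeed regardless of which directory the reactor is started from.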
+ /home/jenkins/tools/maven/latest/bin/mvn test -Pclover 
-DcloverLicenseLocation=/home/jenkins/tools/clover/latest/lib/clover.license
Build step 'Execute shell' marked build as failure
[WARNINGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
Clover xml file does not exist in: 
<https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/clover>
 called: clover.xml and will not be copied to: 
/home/hudson/hudson/jobs/Hadoop-Common-trunk/builds/2011-10-24_09-00-43/clover.xml
Could not find 
'trunk/hadoop-common-project/hadoop-common/target/clover//clover.xml'.  Did you 
generate the XML report for Clover?
Recording test results
Publishing Javadoc
Recording fingerprints
