Build failed in Jenkins: Hadoop-Common-0.23-Build #796
See https://builds.apache.org/job/Hadoop-Common-0.23-Build/796/

--
[...truncated 12209 lines...]
Downloaded: http://repo.maven.apache.org/maven2/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar (11447 KB at 18112.3 KB/sec)
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Creating new database at 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-annotations/target/clover/hadoop-coverage.db'.
[INFO] Processing files at 1.6 source level.
[INFO] Clover all over. Instrumented 8 files (2 packages).
[INFO] Elapsed time = 0.192 secs. (41.667 files/sec, 3,187.5 srclines/sec)
[INFO] No Clover instrumentation done on source files in: [https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-annotations/src/test/java] as no matching sources files found
[INFO]
[INFO] --- maven-resources-plugin:2.2:resources (default-resources) @ hadoop-annotations ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) @ hadoop-annotations ---
[INFO] Compiling 8 source files to https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-annotations/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.2:testResources (default-testResources) @ hadoop-annotations ---
[INFO] Using default encoding to copy filtered resources.
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) @ hadoop-annotations ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.12.3:test (default-test) @ hadoop-annotations ---
[INFO] No tests to run.
[INFO]
[INFO] --- maven-clover2-plugin:3.0.5:clover (clover) @ hadoop-annotations ---
[INFO] Setting property: classpath.resource.loader.class = 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'.
[INFO] Setting property: velocimacro.messages.on = 'false'.
[INFO] Setting property: resource.loader = 'classpath'.
[INFO] Setting property: resource.manager.logwhenfound = 'false'.
[INFO] **
[INFO] Starting Jakarta Velocity v1.4
[INFO] RuntimeInstance initializing.
[INFO] Default Properties File: org/apache/velocity/runtime/defaults/velocity.properties
[INFO] Default ResourceManager initializing. (class org.apache.velocity.runtime.resource.ResourceManagerImpl)
[INFO] Resource Loader Instantiated: org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader
[INFO] ClasspathResourceLoader : initialization starting.
[INFO] ClasspathResourceLoader : initialization complete.
[INFO] ResourceCache : initialized. (class org.apache.velocity.runtime.resource.ResourceCacheImpl)
[INFO] Default ResourceManager initialization complete.
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Literal
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Macro
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Parse
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Include
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Foreach
[INFO] Created: 20 parsers.
[INFO] Velocimacro : initialization starting.
[INFO] Velocimacro : adding VMs from VM library template : VM_global_library.vm
[ERROR] ResourceManager : unable to find resource 'VM_global_library.vm' in any resource loader.
[INFO] Velocimacro : error using VM library template VM_global_library.vm : org.apache.velocity.exception.ResourceNotFoundException: Unable to find
Build failed in Jenkins: Hadoop-Common-trunk #955
See https://builds.apache.org/job/Hadoop-Common-trunk/955/changes

Changes:

[umamahesh] HDFS-5372. In FSNamesystem, hasReadLock() returns false if the current thread holds the write lock (Contributed by Vinay)

[jeagles] MAPREDUCE-5625. TestFixedLengthInputFormat fails in jdk7 environment (Mariappan Asokan via jeagles)

--
[...truncated 59800 lines...]
Adding reference: maven.local.repository
[DEBUG] Initialize Maven Ant Tasks
parsing buildfile jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml with URI = jar:file:/home/jenkins/.m2/repository/org/apache/maven/plugins/maven-antrun-plugin/1.7/maven-antrun-plugin-1.7.jar!/org/apache/maven/ant/tasks/antlib.xml from a zip file
parsing buildfile jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml with URI = jar:file:/home/jenkins/.m2/repository/org/apache/ant/ant/1.8.2/ant-1.8.2.jar!/org/apache/tools/ant/antlib.xml from a zip file
Class org.apache.maven.ant.tasks.AttachArtifactTask loaded from parent loader (parentFirst)
 +Datatype attachartifact org.apache.maven.ant.tasks.AttachArtifactTask
Class org.apache.maven.ant.tasks.DependencyFilesetsTask loaded from parent loader (parentFirst)
 +Datatype dependencyfilesets org.apache.maven.ant.tasks.DependencyFilesetsTask
Setting project property: test.build.dir - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: test.exclude.pattern - _
Setting project property: hadoop.assemblies.version - 3.0.0-SNAPSHOT
Setting project property: test.exclude - _
Setting project property: distMgmtSnapshotsId - apache.snapshots.https
Setting project property: project.build.sourceEncoding - UTF-8
Setting project property: java.security.egd - file:///dev/urandom
Setting project property: distMgmtSnapshotsUrl - https://repository.apache.org/content/repositories/snapshots
Setting project property: distMgmtStagingUrl - https://repository.apache.org/service/local/staging/deploy/maven2
Setting project property: avro.version - 1.7.4
Setting project property: test.build.data - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-dir
Setting project property: commons-daemon.version - 1.0.13
Setting project property: hadoop.common.build.dir - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/../../hadoop-common-project/hadoop-common/target
Setting project property: testsThreadCount - 4
Setting project property: maven.test.redirectTestOutputToFile - true
Setting project property: jdiff.version - 1.0.9
Setting project property: distMgmtStagingName - Apache Release Distribution Repository
Setting project property: project.reporting.outputEncoding - UTF-8
Setting project property: build.platform - Linux-i386-32
Setting project property: protobuf.version - 2.5.0
Setting project property: failIfNoTests - false
Setting project property: protoc.path - ${env.HADOOP_PROTOC_PATH}
Setting project property: jersey.version - 1.9
Setting project property: distMgmtStagingId - apache.staging.https
Setting project property: distMgmtSnapshotsName - Apache Development Snapshot Repository
Setting project property: ant.file - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/pom.xml
[DEBUG] Setting properties with prefix:
Setting project property: project.groupId - org.apache.hadoop
Setting project property: project.artifactId - hadoop-common-project
Setting project property: project.name - Apache Hadoop Common Project
Setting project property: project.description - Apache Hadoop Common Project
Setting project property: project.version - 3.0.0-SNAPSHOT
Setting project property: project.packaging - pom
Setting project property: project.build.directory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target
Setting project property: project.build.outputDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/classes
Setting project property: project.build.testOutputDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/target/test-classes
Setting project property: project.build.sourceDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/src/main/java
Setting project property: project.build.testSourceDirectory - https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/src/test/java
Setting project property: localRepository - id: local url: file:///home/jenkins/.m2/repository/ layout: none
Setting project property: settings.localRepository - /home/jenkins/.m2/repository
Setting project property: maven.project.dependencies.versions -
[INFO] Executing tasks
Build sequence for target(s)
[jira] [Created] (HADOOP-10109) Fix test failure in TestOfflineEditsViewer introduced by HADOOP-10052
Colin Patrick McCabe created HADOOP-10109:
---------------------------------------------

             Summary: Fix test failure in TestOfflineEditsViewer introduced by HADOOP-10052
                 Key: HADOOP-10109
                 URL: https://issues.apache.org/jira/browse/HADOOP-10109
             Project: Hadoop Common
          Issue Type: Sub-task
          Components: test
    Affects Versions: 2.2.1
            Reporter: Colin Patrick McCabe
            Assignee: Colin Patrick McCabe
         Attachments: 0001-HADOOP-10020-addendum.-Fix-TestOfflineEditsViewer.patch

Fix test failure in TestOfflineEditsViewer introduced by HADOOP-10052

--
This message was sent by Atlassian JIRA
(v6.1#6144)
[jira] [Resolved] (HADOOP-10109) Fix test failure in TestOfflineEditsViewer introduced by HADOOP-10052
[ https://issues.apache.org/jira/browse/HADOOP-10109?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Colin Patrick McCabe resolved HADOOP-10109.
-------------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.2.1

Fix test failure in TestOfflineEditsViewer introduced by HADOOP-10052
---------------------------------------------------------------------

                 Key: HADOOP-10109
                 URL: https://issues.apache.org/jira/browse/HADOOP-10109
             Project: Hadoop Common
          Issue Type: Sub-task
          Components: test
    Affects Versions: 2.2.1
            Reporter: Colin Patrick McCabe
            Assignee: Colin Patrick McCabe
             Fix For: 2.2.1
         Attachments: 0001-HADOOP-10020-addendum.-Fix-TestOfflineEditsViewer.patch

Fix test failure in TestOfflineEditsViewer introduced by HADOOP-10052

--
This message was sent by Atlassian JIRA
(v6.1#6144)
[jira] [Created] (HADOOP-10110) hadoop-auth has a build break due to missing dependency
Chuan Liu created HADOOP-10110:
----------------------------------

             Summary: hadoop-auth has a build break due to missing dependency
                 Key: HADOOP-10110
                 URL: https://issues.apache.org/jira/browse/HADOOP-10110
             Project: Hadoop Common
          Issue Type: Bug
            Reporter: Chuan Liu
            Assignee: Chuan Liu
            Priority: Blocker

We have a build break in hadoop-auth if it is built with the Maven cache cleaned. The error looks like the following. The problem exists on both Windows and Linux. If you have old Jetty jars in your Maven cache, you won't see the error.

{noformat}
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:29.469s
[INFO] Finished at: Mon Nov 18 12:30:36 PST 2013
[INFO] Final Memory: 37M/120M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:testCompile (default-testCompile) on project hadoop-auth: Compilation failure: Compilation failure:
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[84,13] cannot access org.mortbay.component.AbstractLifeCycle
[ERROR] class file for org.mortbay.component.AbstractLifeCycle not found
[ERROR] server = new Server(0);
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[94,29] cannot access org.mortbay.component.LifeCycle
[ERROR] class file for org.mortbay.component.LifeCycle not found
[ERROR] server.getConnectors()[0].setHost(host);
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[96,10] cannot find symbol
[ERROR] symbol  : method start()
[ERROR] location: class org.mortbay.jetty.Server
[ERROR] /home/chuan/trunk/hadoop-common-project/hadoop-auth/src/test/java/org/apache/hadoop/security/authentication/client/AuthenticatorTestCase.java:[102,12] cannot find symbol
[ERROR] symbol  : method stop()
[ERROR] location: class org.mortbay.jetty.Server
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-auth
{noformat}

--
This message was sent by Atlassian JIRA
(v6.1#6144)
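The missing classes (org.mortbay.component.AbstractLifeCycle, org.mortbay.component.LifeCycle) ship in the Jetty 6 utility artifact, so one plausible fix is to declare that dependency explicitly in hadoop-auth's pom.xml rather than relying on it arriving transitively via stale jars in the local repository. A sketch only; the exact artifactId, scope, and whether the version is managed by the hadoop-project parent POM would need to be verified:

```xml
<!-- Hypothetical addition to hadoop-common-project/hadoop-auth/pom.xml.
     org.mortbay.component.* lives in jetty-util for the Jetty 6.1.x line;
     the version is assumed to come from the parent POM's dependencyManagement. -->
<dependency>
  <groupId>org.mortbay.jetty</groupId>
  <artifactId>jetty-util</artifactId>
  <scope>test</scope>
</dependency>
```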
[jira] [Created] (HADOOP-10111) Allow DU to be initialized with an initial value
Kihwal Lee created HADOOP-10111:
-----------------------------------

             Summary: Allow DU to be initialized with an initial value
                 Key: HADOOP-10111
                 URL: https://issues.apache.org/jira/browse/HADOOP-10111
             Project: Hadoop Common
          Issue Type: Improvement
            Reporter: Kihwal Lee

--
This message was sent by Atlassian JIRA
(v6.1#6144)
[jira] [Created] (HADOOP-10112) har file listing doesn't work with wild card
Brandon Li created HADOOP-10112:
-----------------------------------

             Summary: har file listing doesn't work with wild card
                 Key: HADOOP-10112
                 URL: https://issues.apache.org/jira/browse/HADOOP-10112
             Project: Hadoop Common
          Issue Type: Bug
          Components: tools
            Reporter: Brandon Li

{noformat}
[test@test001 root]$ hdfs dfs -ls har:///tmp/filename.har/*
-ls: Can not create a Path from an empty string
Usage: hadoop fs [generic options] -ls [-d] [-h] [-R] [<path> ...]
{noformat}

It works without *.

--
This message was sent by Atlassian JIRA
(v6.1#6144)
[jira] [Created] (HADOOP-10113) There are some threads which will be dead silently when uncaught exception/error occured
Kousuke Saruta created HADOOP-10113:
---------------------------------------

             Summary: There are some threads which will be dead silently when uncaught exception/error occured
                 Key: HADOOP-10113
                 URL: https://issues.apache.org/jira/browse/HADOOP-10113
             Project: Hadoop Common
          Issue Type: Bug
    Affects Versions: 3.0.0
            Reporter: Kousuke Saruta
             Fix For: 3.0.0

Related to HDFS-5500, I found that some threads die silently when an uncaught exception or error occurs. For example, the following threads are affected:

* refreshUsed in DU
* reloader in ReloadingX509TrustManager
* t in UserGroupInformation#spawnAutoRenewalThreadForUserCreds
* errThread in Shell#runCommand
* sinkThread in MetricsSinkAdapter
* blockScannerThread in DataBlockScanner
* emptier in NameNode#startTrashEmptier (when we use TrashPolicyDefault)

Some of these threads are critical if their death goes unnoticed (e.g. DU). I think we should handle those exceptions/errors, and monitor the threads' liveness or log their death.

--
This message was sent by Atlassian JIRA
(v6.1#6144)
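The kind of fix this report asks for can be sketched with Thread.setUncaughtExceptionHandler, which lets the owner of a background thread at least log its death instead of losing it silently. This is only an illustrative sketch; the thread name "du-refresh" and the plain System.out logging are stand-ins, not Hadoop's actual code:

```java
// Sketch: attach an UncaughtExceptionHandler so a background thread's death
// is observed and logged instead of being silently swallowed.
// All names here are illustrative, not taken from Hadoop source.
public class UncaughtLogger {
    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() -> {
            // Simulates a refresh loop (like DU's refreshUsed) hitting a fatal error.
            throw new IllegalStateException("refresh failed");
        }, "du-refresh");

        // Without a handler, the failure only reaches stderr (or nowhere, if the
        // default handler was replaced) and no component notices the dead thread.
        worker.setUncaughtExceptionHandler((t, e) ->
            System.out.println("Thread " + t.getName() + " died: " + e.getMessage()));

        worker.start();
        worker.join();
    }
}
```

A real fix would also restart or flag the critical thread, not just log; the handler is only the hook that makes the death visible.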
[jira] [Resolved] (HADOOP-9990) TestMetricsSystemImpl fails
[ https://issues.apache.org/jira/browse/HADOOP-9990?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Eagles resolved HADOOP-9990.
-------------------------------------
    Resolution: Duplicate

TestMetricsSystemImpl fails
---------------------------

                 Key: HADOOP-9990
                 URL: https://issues.apache.org/jira/browse/HADOOP-9990
             Project: Hadoop Common
          Issue Type: Bug
          Components: metrics, test
    Affects Versions: 2.1.1-beta
         Environment: Ubuntu 13.04
            Reporter: Tsuyoshi OZAWA
            Priority: Minor

TestMetricsSystemImpl#testMultiThreadedPublish and testInitFirstVerifyStopInvokedImmediately fail occasionally.

{code}
Running org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl
Tests run: 6, Failures: 2, Errors: 0, Skipped: 0, Time elapsed: 2.787 sec <<< FAILURE! - in org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl
testMultiThreadedPublish(org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl)  Time elapsed: 0.078 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<2>
	at org.junit.Assert.fail(Assert.java:93)
	at org.junit.Assert.failNotEquals(Assert.java:647)
	at org.junit.Assert.assertEquals(Assert.java:128)
	at org.junit.Assert.assertEquals(Assert.java:472)
	at org.junit.Assert.assertEquals(Assert.java:456)
	at org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl.testMultiThreadedPublish(TestMetricsSystemImpl.java:231)
{code}

{code}
testInitFirstVerifyStopInvokedImmediately(org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl)  Time elapsed: 0.05 sec  <<< FAILURE!
org.mockito.exceptions.base.MockitoAssertionError: Wanted at most 2 times but was 3
	at org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl.testInitFirstVerifyStopInvokedImmediately(TestMetricsSystemImpl.java:114)
{code}

--
This message was sent by Atlassian JIRA
(v6.1#6144)
[jira] [Created] (HADOOP-10114) TestHdfsNativeCodeLoader should reside in hadoop-common instead of hadoop-hdfs
Haohui Mai created HADOOP-10114:
-----------------------------------

             Summary: TestHdfsNativeCodeLoader should reside in hadoop-common instead of hadoop-hdfs
                 Key: HADOOP-10114
                 URL: https://issues.apache.org/jira/browse/HADOOP-10114
             Project: Hadoop Common
          Issue Type: Bug
            Reporter: Haohui Mai
            Assignee: Haohui Mai

TestHdfsNativeCodeLoader tests whether Java can load the native library libhadoop.so. However, it is the hadoop-common project, rather than the hadoop-hdfs project, that creates this library during the build. Therefore this unit test complains that it cannot find the library whenever Jenkins does not rebuild hadoop-common.

HDFS-3987 is an example that hits this bug. Its patch touches only hadoop-auth and hadoop-hdfs, so Jenkins decides not to build hadoop-common, and the unit test fails simply because it cannot find libhadoop.so. The log of the build can be found at https://builds.apache.org/job/PreCommit-HDFS-Build/5470/consoleFull

We can avoid this problem by moving the unit test to hadoop-common.

--
This message was sent by Atlassian JIRA
(v6.1#6144)
[jira] [Created] (HADOOP-10115) Exclude duplicate jars in hadoop package under different component's lib
Vinay created HADOOP-10115:
------------------------------

             Summary: Exclude duplicate jars in hadoop package under different component's lib
                 Key: HADOOP-10115
                 URL: https://issues.apache.org/jira/browse/HADOOP-10115
             Project: Hadoop Common
          Issue Type: Bug
    Affects Versions: 2.2.0, 3.0.0
            Reporter: Vinay
            Assignee: Vinay

In the Hadoop package distribution, more than 90% of the jars are duplicated in multiple places. For example, almost all jars in share/hadoop/hdfs/lib are already present in share/hadoop/common/lib. The same is true for every other lib directory under share. For the daemon processes, all of these directories are added to the classpath anyway.

So, to reduce the package distribution size and the classpath overhead, remove the duplicate jars from the distribution.

--
This message was sent by Atlassian JIRA
(v6.1#6144)
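A rough way to measure the overlap the report describes is to group the jar files under a share/ tree by basename and keep the names that appear in more than one place. A minimal sketch; the directory layout and jar names below are fabricated fixtures, not the real distribution:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Sketch: find jar basenames that occur under more than one lib/ directory.
public class DuplicateJars {
    static Map<String, List<Path>> findDuplicates(Path root) throws IOException {
        try (Stream<Path> paths = Files.walk(root)) {
            return paths.filter(p -> p.toString().endsWith(".jar"))
                        // group full paths by the jar's file name
                        .collect(Collectors.groupingBy(p -> p.getFileName().toString()))
                        .entrySet().stream()
                        // keep only names seen in 2+ locations
                        .filter(e -> e.getValue().size() > 1)
                        .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
        }
    }

    public static void main(String[] args) throws IOException {
        // Fabricated fixture mimicking share/hadoop/{hdfs,common}/lib overlap.
        Path root = Files.createTempDirectory("share");
        Files.createDirectories(root.resolve("hdfs/lib"));
        Files.createDirectories(root.resolve("common/lib"));
        Files.createFile(root.resolve("hdfs/lib/guava-11.0.2.jar"));
        Files.createFile(root.resolve("common/lib/guava-11.0.2.jar"));
        Files.createFile(root.resolve("common/lib/only-here.jar"));

        System.out.println(findDuplicates(root).keySet());
    }
}
```

Running the same scan against a real distribution's share/hadoop directory would quantify how much the proposed de-duplication could save.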