Build failed in Jenkins: Hadoop-Common-0.23-Build #504
See https://builds.apache.org/job/Hadoop-Common-0.23-Build/504/changes

Changes:

[jlowe] svn merge -c 1437775 FIXES: YARN-354. WebAppProxyServer exits immediately after startup. Contributed by Liang Xie

[suresh] HDFS-4426. Merge change 1437627 from trunk.

--
[...truncated 20688 lines...]
Running org.apache.hadoop.io.file.tfile.TestTFileComparator2
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.46 sec
Running org.apache.hadoop.io.file.tfile.TestTFile
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.426 sec
Running org.apache.hadoop.io.file.tfile.TestTFileSplit
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.313 sec
Running org.apache.hadoop.io.file.tfile.TestTFileSeqFileComparison
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.01 sec
Running org.apache.hadoop.io.file.tfile.TestTFileNoneCodecsStreams
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.878 sec
Running org.apache.hadoop.io.file.tfile.TestTFileComparators
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.523 sec
Running org.apache.hadoop.io.TestMapFile
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.611 sec
Running org.apache.hadoop.io.TestBytesWritable
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec
Running org.apache.hadoop.io.TestArrayFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.254 sec
Running org.apache.hadoop.io.TestVersionedWritable
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.063 sec
Running org.apache.hadoop.io.TestArrayWritable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.064 sec
Running org.apache.hadoop.io.nativeio.TestNativeIO
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.856 sec
Running org.apache.hadoop.io.TestWritableUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.119 sec
Running org.apache.hadoop.io.TestIOUtils
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.284 sec
Running org.apache.hadoop.io.serializer.avro.TestAvroSerialization
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.538 sec
Running org.apache.hadoop.io.serializer.TestWritableSerialization
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.308 sec
Running org.apache.hadoop.io.serializer.TestSerializationFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.271 sec
Running org.apache.hadoop.test.TestMultithreadedTestUtil
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.157 sec
Running org.apache.hadoop.test.TestTimedOutTestsListener
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.112 sec
Running org.apache.hadoop.net.TestSocketIOWithTimeout
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.206 sec
Running org.apache.hadoop.net.TestSwitchMapping
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.293 sec
Running org.apache.hadoop.net.TestNetUtils
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.493 sec
Running org.apache.hadoop.net.TestStaticMapping
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.454 sec
Running org.apache.hadoop.net.TestScriptBasedMapping
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.913 sec
Running org.apache.hadoop.net.TestDNS
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.152 sec

Results :

Tests run: 1598, Failures: 0, Errors: 0, Skipped: 4

[INFO]
[INFO] --- maven-clover2-plugin:3.0.5:clover (clover) @ hadoop-common ---
[INFO] Using /default-clover-report descriptor.
[INFO] Using Clover report descriptor: /tmp/mvn6129154145065043139resource
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Clover is enabled with initstring 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-common/target/clover/hadoop-coverage.db'
[WARNING] Clover historical directory [https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-common/target/clover/history] does not exist, skipping Clover historical report generation ([https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-common/target/clover])
[INFO] Clover Version 3.0.2, built on April 13 2010 (build-790)
[INFO] Loaded from: /home/jenkins/.m2/repository/com/cenqua/clover/clover/3.0.2/clover-3.0.2.jar
[INFO] Clover: Open Source License registered to Apache.
[INFO] Loading coverage database from: 'https://builds.apache.org/job/Hadoop-Common-0.23-Build/ws/trunk/hadoop-common-project/hadoop-common/target/clover/hadoop-coverage.db'
[WARNING] Failed to load coverage recording
Build failed in Jenkins: Hadoop-Common-trunk #663
See https://builds.apache.org/job/Hadoop-Common-trunk/663/changes

Changes:

[szetszwo] Add .classpath, .project, .settings and target to svn:ignore.

[jlowe] YARN-354. WebAppProxyServer exits immediately after startup. Contributed by Liang Xie

[suresh] HDFS-4426. Secondary namenode shuts down immediately after startup. Contributed by Arpit Agarwal.

[tomwhite] YARN-319. Submitting a job to a fair scheduler queue for which the user does not have permission causes the client to wait forever. Contributed by shenhong.

--
[...truncated 31041 lines...]
[exec]
[exec] unpack-plugin:
[exec]
[exec] install-plugin:
[exec]
[exec] configure-plugin:
[exec]
[exec] configure-output-plugin:
[exec] Mounting output plugin: org.apache.forrest.plugin.output.pdf
[exec] Processing https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/tmp/output.xmap to https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/tmp/output.xmap.new
[exec] Loading stylesheet /home/jenkins/tools/forrest/latest/main/var/pluginMountSnippet.xsl
[exec] Moving 1 file to https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/tmp
[exec]
[exec] configure-plugin-locationmap:
[exec] Mounting plugin locationmap for org.apache.forrest.plugin.output.pdf
[exec] Processing https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/tmp/locationmap.xml to https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/tmp/locationmap.xml.new
[exec] Loading stylesheet /home/jenkins/tools/forrest/latest/main/var/pluginLmMountSnippet.xsl
[exec] Moving 1 file to https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/tmp
[exec]
[exec] init:
[exec]
[exec] -prepare-classpath:
[exec]
[exec] check-contentdir:
[exec]
[exec] examine-proj:
[exec]
[exec] validation-props:
[exec] Using these catalog descriptors: /home/jenkins/tools/forrest/latest/main/webapp/resources/schema/catalog.xcat:/home/jenkins/tools/forrest/latest/build/plugins/catalog.xcat:https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/src/documentation/resources/schema/catalog.xcat
[exec]
[exec] validate-xdocs:
[exec] 7 file(s) have been successfully validated.
[exec] ...validated xdocs
[exec]
[exec] validate-skinconf:
[exec] Warning: https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/webapp/resources not found.
[exec] 1 file(s) have been successfully validated.
[exec] ...validated skinconf
[exec]
[exec] validate-sitemap:
[exec]
[exec] validate-skins-stylesheets:
[exec]
[exec] validate-skins:
[exec]
[exec] validate-skinchoice:
[exec] ...validated existence of skin 'pelt'
[exec]
[exec] validate-stylesheets:
[exec]
[exec] validate:
[exec]
[exec] site:
[exec]
[exec] Copying the various non-generated resources to site.
[exec] Warnings will be issued if the optional project resources are not found.
[exec] This is often the case, because they are optional and so may not be available.
[exec] Copying project resources and images to site ...
[exec] Copied 1 empty directory to 1 empty directory under https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/site
[exec] Copying main skin images to site ...
[exec] Created dir: https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/site/skin/images
[exec] Copying 20 files to https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/build/site/skin/images
[exec] Warning: https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/src/documentation/skins/common/images not found.
[exec] Warning: https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/src/documentation/skins/pelt/images not found.
[exec] Warning: https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/src/documentation/skins/common not found.
[exec] Warning: https://builds.apache.org/job/Hadoop-Common-trunk/ws/trunk/hadoop-common-project/hadoop-common/target/docs-src/src/documentation/skins/pelt not
[jira] [Created] (HADOOP-9240) Making ArrayWritable typed
David Parks created HADOOP-9240:
---

Summary: Making ArrayWritable typed
Key: HADOOP-9240
URL: https://issues.apache.org/jira/browse/HADOOP-9240
Project: Hadoop Common
Issue Type: Improvement
Components: io
Reporter: David Parks
Priority: Minor

ArrayWritable is just painful to use in practice; it would be nice if we had a typed version of ArrayWritable that had all the features of an ArrayList. It wasn't hard to write, and it doesn't cost more in terms of storage or CPU than an ArrayWritable. So I wonder: why not include a more usable ArrayListWritable class with Hadoop? Code pasted below; unless there's a reason this is a bad idea, I'm happy to provide the unit test for it as well.

    @SuppressWarnings("serial")
    public class ArrayListWritable<E extends Writable> extends ArrayList<E> implements Writable {

        Class<E> type = null;

        public ArrayListWritable(Class<E> type) {
            super();
            this.type = type;
        }

        public ArrayListWritable() {
            super();
        }

        public void setArrayClassType(Class<E> clazz) {
            this.type = clazz;
        }

        @Override
        public void write(DataOutput out) throws IOException {
            if (type == null) {
                throw new IOException("Cannot write an " + getClass().getName()
                    + " without the class type being set in the constructor or with setArrayClassType(...)");
            }
            out.writeUTF(type.getCanonicalName());
            out.writeInt(size());
            for (E writable : this)
                writable.write(out);
        }

        @SuppressWarnings("unchecked")
        @Override
        public void readFields(DataInput in) throws IOException {
            clear();
            // Read the class name
            try {
                type = (Class<E>) Class.forName(in.readUTF());
            } catch (ClassNotFoundException e) {
                throw new IOException("Invalid canonical name read from input bytes", e);
            }
            // Read the size; set the capacity of the ArrayList
            int size = in.readInt();
            ensureCapacity(size);
            // Read in individual writables
            for (int i = 0; i < size; i++) {
                Writable value = WritableFactories.newInstance(type);
                value.readFields(in); // read a value
                add((E) value);       // add it to the ArrayList
            }
        }
    }

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
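The proposal above boils down to a simple wire layout: the element class name, the element count, then each element's own serialization. The round trip can be sketched self-containedly with plain java.io, using a minimal stand-in for Hadoop's Writable interface (IntBox, serialize, and deserialize below are illustrative names for this sketch, not part of Hadoop):

```java
import java.io.*;
import java.util.*;

// Minimal stand-in for org.apache.hadoop.io.Writable so the sketch runs without Hadoop.
interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

// Illustrative element type, loosely analogous to Hadoop's IntWritable.
class IntBox implements Writable {
    int value;
    public IntBox() {}
    public IntBox(int v) { value = v; }
    public void write(DataOutput out) throws IOException { out.writeInt(value); }
    public void readFields(DataInput in) throws IOException { value = in.readInt(); }
}

public class ArrayListWritableDemo {
    // Write the layout used in the patch: class name, size, then each element.
    public static byte[] serialize(List<? extends Writable> list) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(bos);
        out.writeUTF(list.get(0).getClass().getName()); // sketch only: assumes a non-empty list
        out.writeInt(list.size());
        for (Writable w : list) w.write(out);
        out.flush();
        return bos.toByteArray();
    }

    // Read it back: instantiate the recorded type reflectively, then fill each element.
    public static List<Writable> deserialize(byte[] bytes) throws Exception {
        DataInputStream in = new DataInputStream(new ByteArrayInputStream(bytes));
        Class<?> type = Class.forName(in.readUTF());
        int size = in.readInt();
        List<Writable> result = new ArrayList<>(size);
        for (int i = 0; i < size; i++) {
            Writable w = (Writable) type.getDeclaredConstructor().newInstance();
            w.readFields(in);
            result.add(w);
        }
        return result;
    }

    public static void main(String[] args) throws Exception {
        List<Writable> out = deserialize(serialize(Arrays.asList(new IntBox(1), new IntBox(2))));
        System.out.println(((IntBox) out.get(1)).value); // prints 2
    }
}
```

The real class avoids reflection on the read path by delegating to WritableFactories.newInstance, but the byte layout is the same.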
[jira] [Created] (HADOOP-9241) DU refresh interval is not configurable
Harsh J created HADOOP-9241:
---

Summary: DU refresh interval is not configurable
Key: HADOOP-9241
URL: https://issues.apache.org/jira/browse/HADOOP-9241
Project: Hadoop Common
Issue Type: Improvement
Affects Versions: 2.0.2-alpha
Reporter: Harsh J
Priority: Trivial

While the {{DF}} class's refresh interval is configurable, the {{DU}}'s isn't. We should make both configurable.
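For comparison, the DF interval is already exposed through the fs.df.interval property, so a DU equivalent would presumably be a sibling key. A hypothetical core-site.xml entry (the fs.du.interval name is an assumption here, not a key that existed at the time of this report):

```xml
<!-- Hypothetical key mirroring the existing fs.df.interval property. -->
<property>
  <name>fs.du.interval</name>
  <value>600000</value> <!-- refresh disk-usage figures every 10 minutes (milliseconds) -->
</property>
```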
[jira] [Reopened] (HADOOP-9217) Print thread dumps when hadoop-common tests fail
[ https://issues.apache.org/jira/browse/HADOOP-9217?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrey Klochkov reopened HADOOP-9217:
-

Print thread dumps when hadoop-common tests fail
Key: HADOOP-9217
URL: https://issues.apache.org/jira/browse/HADOOP-9217
Project: Hadoop Common
Issue Type: Test
Components: test
Affects Versions: 2.0.2-alpha, 0.23.5
Reporter: Andrey Klochkov
Assignee: Andrey Klochkov
Fix For: 2.0.3-alpha, 0.23.6
Attachments: HADOOP-9217-fix1.patch, HADOOP-9217.patch

Printing thread dumps when tests fail due to timeouts was introduced in HADOOP-8755, but was enabled only in M/R, HDFS, and YARN. It makes sense to enable it in hadoop-common as well. In particular, TestZKFailoverController is currently one of the flakiest tests in trunk, and having thread dumps may help in debugging it.
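The HADOOP-8755 mechanism referenced here is a JUnit run listener (org.apache.hadoop.test.TestTimedOutTestsListener) registered with Surefire, so enabling it for hadoop-common would amount to a pom fragment along these lines (a sketch of the wiring pattern, not the exact committed patch):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <properties>
      <property>
        <!-- Surefire's "listener" property registers a JUnit RunListener for every test run -->
        <name>listener</name>
        <value>org.apache.hadoop.test.TestTimedOutTestsListener</value>
      </property>
    </properties>
  </configuration>
</plugin>
```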
[jira] [Created] (HADOOP-9242) Duplicate surefire plugin config in hadoop-common
Andrey Klochkov created HADOOP-9242:
---

Summary: Duplicate surefire plugin config in hadoop-common
Key: HADOOP-9242
URL: https://issues.apache.org/jira/browse/HADOOP-9242
Project: Hadoop Common
Issue Type: Bug
Components: test
Affects Versions: 2.0.3-alpha, 0.23.6
Reporter: Andrey Klochkov
Assignee: Andrey Klochkov

Unfortunately, HADOOP-9217 introduced a duplicate configuration of the Surefire plugin in hadoop-common/pom.xml, effectively discarding part of the configuration.
.m2 repo messed up on hadoop6
A few pre-commit builds have been failing recently with compile errors which I think are due to a bad jar in the /home/jenkins/.m2 repo on hadoop6. For example, both of these builds:

https://builds.apache.org/view/G-L/view/Hadoop/job/PreCommit-HDFS-Build/3878/
https://builds.apache.org/view/G-L/view/Hadoop/job/PreCommit-HDFS-Build/3879/

failed with this error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:compile (default-compile) on project hadoop-yarn-api: Compilation failure: Compilation failure:
[ERROR] error: error reading /home/jenkins/.m2/repository/org/glassfish/grizzly/grizzly-framework/2.1.1/grizzly-framework-2.1.1.jar; error in opening zip file
[ERROR] error: error reading /home/jenkins/.m2/repository/org/glassfish/grizzly/grizzly-rcm/2.1.1/grizzly-rcm-2.1.1.jar; error in opening zip file
[ERROR] error: error reading /home/jenkins/.m2/repository/org/glassfish/grizzly/grizzly-framework/2.1.1/grizzly-framework-2.1.1-tests.jar; error in opening zip file

Could someone with access to the build slaves please clear out /home/jenkins/.m2 on hadoop6? Alternatively, could I be given access to the build slave machines so I can fix issues like this in the future myself?

Thanks a lot.

--
Aaron T. Myers
Software Engineer, Cloudera
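Corrupt cached artifacts like these can be located mechanically rather than by wiping the whole repository: every jar is a zip archive, so any file that java.util.zip.ZipFile refuses to open is a candidate for deletion. A small sketch (JarScan is an illustrative name, not an existing tool):

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.*;
import java.util.stream.*;
import java.util.zip.ZipFile;

public class JarScan {
    // Walk a directory tree and report every .jar that fails to open as a zip archive.
    public static List<Path> findCorruptJars(Path root) throws IOException {
        try (Stream<Path> files = Files.walk(root)) {
            return files
                .filter(p -> p.toString().endsWith(".jar"))
                .filter(p -> {
                    try (ZipFile zf = new ZipFile(p.toFile())) {
                        return false; // opens cleanly, keep it
                    } catch (IOException e) {
                        return true;  // truncated or corrupt archive
                    }
                })
                .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        // Scan the local Maven cache, if present, and print each bad jar.
        Path repo = Paths.get(System.getProperty("user.home"), ".m2", "repository");
        if (Files.isDirectory(repo)) {
            for (Path bad : findCorruptJars(repo)) System.out.println(bad);
        }
    }
}
```

Deleting only the reported files forces Maven to re-download just those artifacts on the next build, instead of repopulating the entire cache.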
[jira] [Created] (HADOOP-9243) Some improvements to the mailing lists webpage for lowering unrelated content rate
Harsh J created HADOOP-9243:
---

Summary: Some improvements to the mailing lists webpage for lowering unrelated content rate
Key: HADOOP-9243
URL: https://issues.apache.org/jira/browse/HADOOP-9243
Project: Hadoop Common
Issue Type: Improvement
Components: documentation
Reporter: Harsh J
Priority: Minor

From Steve on HADOOP-9329:

{quote}
* could you add a bit of text to say user@ is not the place to discuss installation problems related to any third party products that install some variant of Hadoop on people's desktops and servers. You're the one who ends up having to bounce off all the CDH-related queries -it would help you too.
* For the new Invalid JIRA link to paste into JIRA issues about this, I point to the distributions and Commercial support page on the wiki -something similar on the mailing lists page would avoid having to put any specific vendor links into the mailing lists page, and support a higher/more open update process. See http://wiki.apache.org/hadoop/InvalidJiraIssues
{quote}
[jira] [Created] (HADOOP-9245) Post HADOOP-8924, running mvn clean (without running mvn install before) fails
Karthik Kambatla created HADOOP-9245:
---

Summary: Post HADOOP-8924, running mvn clean (without running mvn install before) fails
Key: HADOOP-9245
URL: https://issues.apache.org/jira/browse/HADOOP-9245
Project: Hadoop Common
Issue Type: Bug
Components: build
Affects Versions: 2.0.3-alpha
Reporter: Karthik Kambatla
Assignee: Karthik Kambatla

HADOOP-8924 introduced a plugin dependency on hadoop-maven-plugins in hadoop-common and hadoop-yarn-common. Calling mvn clean on a fresh m2/repository (missing hadoop-maven-plugins) fails due to this dependency.
Re: .m2 repo messed up on hadoop6
I just cleaned the ~/.m2 cache on hadoop6.

-Giri

On Thu, Jan 24, 2013 at 1:17 PM, Aaron T. Myers a...@cloudera.com wrote:
> Could someone with access to the build slaves please clear out
> /home/jenkins/.m2 on hadoop6?
[jira] [Resolved] (HADOOP-9245) mvn clean without running mvn install before fails
[ https://issues.apache.org/jira/browse/HADOOP-9245?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Suresh Srinivas resolved HADOOP-9245.
-

Resolution: Fixed
Fix Version/s: 3.0.0
Hadoop Flags: Reviewed

I committed the patch to trunk and branch-trunk-win. Thank you, Karthik!

mvn clean without running mvn install before fails
--
Key: HADOOP-9245
URL: https://issues.apache.org/jira/browse/HADOOP-9245
Project: Hadoop Common
Issue Type: Bug
Components: build
Affects Versions: 3.0.0, trunk-win
Reporter: Karthik Kambatla
Assignee: Karthik Kambatla
Fix For: 3.0.0
Attachments: HADOOP-9245.patch

HADOOP-8924 introduces plugin dependency on hadoop-maven-plugins in hadoop-common and hadoop-yarn-common. Calling mvn clean on a fresh m2/repository (missing hadoop-maven-plugins) fails due to this dependency.
[jira] [Created] (HADOOP-9246) Execution phase for hadoop-maven-plugin should be prepare-resources
Karthik Kambatla created HADOOP-9246:
---

Summary: Execution phase for hadoop-maven-plugin should be prepare-resources
Key: HADOOP-9246
URL: https://issues.apache.org/jira/browse/HADOOP-9246
Project: Hadoop Common
Issue Type: Bug
Components: build
Affects Versions: 3.0.0, trunk-win
Reporter: Karthik Kambatla
Assignee: Karthik Kambatla