See <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1259/changes>

Changes:

[szetszwo] HDFS-10390. Implement asynchronous setAcl/getAclStatus for

[naganarasimha_gr] YARN-5114. Add additional tests in TestRMWebServicesApps and rectify

------------------------------------------
[...truncated 9138 lines...]
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/>
        permissions: drwx
path '/home/jenkins/jenkins-slave/workspace': 
        absolute:/home/jenkins/jenkins-slave/workspace
        permissions: drwx
path '/home/jenkins/jenkins-slave': 
        absolute:/home/jenkins/jenkins-slave
        permissions: drwx
path '/home/jenkins': 
        absolute:/home/jenkins
        permissions: drwx
path '/home': 
        absolute:/home
        permissions: dr-x
path '/': 
        absolute:/
        permissions: dr-x

        at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:848)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:490)
        at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:449)
        at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncAPI(TestAsyncDFSRename.java:313)
        at org.apache.hadoop.hdfs.TestAsyncDFSRename.testConservativeConcurrentAsyncAPI(TestAsyncDFSRename.java:284)

testAsyncAPIWithException(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 0.852 sec  <<< ERROR!
java.io.IOException: Cannot remove data directory: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/datapath> '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs/data>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2/dfs>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data/2>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test/data>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/test>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source>
        permissions: drwx
path '<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/'>: 
        absolute:<https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/>
        permissions: drwx
path '/home/jenkins/jenkins-slave/workspace': 
        absolute:/home/jenkins/jenkins-slave/workspace
        permissions: drwx
path '/home/jenkins/jenkins-slave': 
        absolute:/home/jenkins/jenkins-slave
        permissions: drwx
path '/home/jenkins': 
        absolute:/home/jenkins
        permissions: drwx
path '/home': 
        absolute:/home
        permissions: dr-x
path '/': 
        absolute:/
        permissions: dr-x

        at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:848)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:490)
        at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:449)
        at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAsyncAPIWithException(TestAsyncDFSRename.java:501)

testAggressiveConcurrentAsyncRenameWithOverwrite(org.apache.hadoop.hdfs.TestAsyncDFSRename)  Time elapsed: 60.004 sec  <<< ERROR!
java.lang.Exception: test timed out after 60000 milliseconds
        at java.security.AccessController.getStackAccessControlContext(Native Method)
        at java.security.AccessController.getContext(AccessController.java:820)
        at java.lang.Thread.init(Thread.java:412)
        at java.lang.Thread.init(Thread.java:349)
        at java.lang.Thread.<init>(Thread.java:445)
        at org.apache.hadoop.util.Daemon.<init>(Daemon.java:52)
        at org.apache.hadoop.hdfs.DataStreamer.<init>(DataStreamer.java:427)
        at org.apache.hadoop.hdfs.DataStreamer.<init>(DataStreamer.java:454)
        at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:228)
        at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:286)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
        at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
        at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:435)
        at org.apache.hadoop.hdfs.DistributedFileSystem$8.doCall(DistributedFileSystem.java:432)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:446)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:976)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:394)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:375)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:368)
        at org.apache.hadoop.hdfs.DFSTestUtil.createFile(DFSTestUtil.java:361)
        at org.apache.hadoop.hdfs.TestAsyncDFSRename.internalTestConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:225)
        at org.apache.hadoop.hdfs.TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite(TestAsyncDFSRename.java:199)

"IPC Server handler 6 on 37571" daemon prio=5 tid=163 tim
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.046 sec - in org.apache.hadoop.hdfs.TestDataTransferProtocol
Running org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.281 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestKeyProviderCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.533 sec - in org.apache.hadoop.hdfs.TestKeyProviderCache
Running org.apache.hadoop.net.TestNetworkTopology
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.066 sec - in org.apache.hadoop.net.TestNetworkTopology
Running org.apache.hadoop.tracing.TestTraceAdmin
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.79 sec - in org.apache.hadoop.tracing.TestTraceAdmin
Running org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.816 sec - in org.apache.hadoop.hdfs.TestDecommissionWithStriped
Running org.apache.hadoop.tracing.TestTracing
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.369 sec - in org.apache.hadoop.tracing.TestTracingShortCircuitLocalRead
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.806 sec - in org.apache.hadoop.tracing.TestTracing
Running org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 361.121 sec - in org.apache.hadoop.hdfs.qjournal.client.TestQJMWithFaults
Running org.apache.hadoop.TestRefreshCallQueue
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.749 sec - in org.apache.hadoop.TestRefreshCallQueue
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.344 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.613 sec - in org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithHdfs
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 125.568 sec - in org.apache.hadoop.hdfs.TestDFSStripedOutputStreamWithFailure

Results :

Failed tests: 
  TestNamenodeCapacityReport.testXceiverCount:280->checkClusterHealth:320 expected:<14.0> but was:<15.0>
  TestStandbyCheckpoints.shutdownCluster:141 Test resulted in an unexpected exit

Tests in error: 
  TestStandbyCheckpoints.testStandbyExceptionThrownDuringCheckpoint:373 »  test ...
  TestAsyncDFSRename.testAggressiveConcurrentAsyncAPI:289->internalTestConcurrentAsyncAPI:328 » 
  TestAsyncDFSRename.testConservativeConcurrentAsyncAPI:284->internalTestConcurrentAsyncAPI:313 » IO
  TestAsyncDFSRename.testAsyncAPIWithException:501 » IO Cannot remove data direc...
  TestAsyncDFSRename.testAggressiveConcurrentAsyncRenameWithOverwrite:199->internalTestConcurrentAsyncRenameWithOverwrite:225 » 

Tests run: 4431, Failures: 2, Errors: 5, Skipped: 17
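
The TestAsyncDFSRename "Cannot remove data directory" errors above are thrown by MiniDFSCluster.initMiniDFSCluster() when it cannot delete the DataNode directories left under the workspace by an earlier run. As a minimal, hedged sketch (not taken from this build; the class name and scratch path below are illustrative assumptions, while MiniDFSCluster.Builder and the hdfs.minidfs.basedir key are the standard public MiniDFSCluster ones), a test can pin the cluster's base directory to a location the build user can always wipe:

    import java.io.File;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hdfs.HdfsConfiguration;
    import org.apache.hadoop.hdfs.MiniDFSCluster;

    public class MiniDfsScratchDirExample {
      public static void main(String[] args) throws Exception {
        Configuration conf = new HdfsConfiguration();

        // Illustrative scratch location: the build user must be able to delete
        // and recreate everything under this directory, because the cluster
        // wipes its old DataNode directories here during startup -- the step
        // that threw "Cannot remove data directory" in the log above.
        File baseDir = new File(System.getProperty("java.io.tmpdir"), "minidfs-scratch");
        conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath());

        MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf)
            .numDataNodes(2)
            .build();
        try {
          System.out.println("MiniDFSCluster up at " + cluster.getFileSystem().getUri());
        } finally {
          cluster.shutdown();
        }
      }
    }

On a Jenkins slave this keeps the DataNode directories out of a workspace whose permissions may have been left in a bad state by a previous build.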

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS Native Client
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-alpha1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [04:19 min]
[INFO] Apache Hadoop HDFS ................................. FAILURE [  01:17 h]
[INFO] Apache Hadoop HDFS Native Client ................... SKIPPED
[INFO] Apache Hadoop HttpFS ............................... SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................. SKIPPED
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.079 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:21 h
[INFO] Finished at: 2016-05-25T00:06:57+00:00
[INFO] Final Memory: 100M/4332M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/ws/source/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Recording test results
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3
Setting LATEST1_8_HOME=/home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8
Setting MAVEN_3_3_3_HOME=/home/jenkins/jenkins-slave/tools/hudson.tasks.Maven_MavenInstallation/maven-3.3.3

---------------------------------------------------------------------
To unsubscribe, e-mail: hdfs-dev-unsubscr...@hadoop.apache.org
For additional commands, e-mail: hdfs-dev-h...@hadoop.apache.org
