See https://builds.apache.org/job/Hadoop-Hdfs-trunk/2939/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 5407 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Skipping javadoc generation
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:49 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  04:35 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.081 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:40 h
[INFO] Finished at: 2016-03-18T02:36:50+00:00
[INFO] Final Memory: 57M/713M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
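
Note: the resume hint in the log leaves the goals elided as <goals>. A minimal sketch of resuming the reactor from the failed module, assuming the job was running the "package" goal (an assumption; substitute the goals the Jenkins job actually invokes):

  # "package" is a hypothetical stand-in for the job's real goals;
  # -rf resumes the Maven reactor from the hadoop-hdfs module
  mvn package -rf :hadoop-hdfs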



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestErasureCodeBenchmarkThroughput.testECReadWrite

Error Message:
java.io.IOException: Data streamers failed while creating new block streams: [#5: failed, blk_-9223372036854775659_1016, #4: failed, blk_-9223372036854775660_1016, #0: failed, blk_-9223372036854775664_1016, #1: failed, blk_-9223372036854775663_1016, #2: failed, blk_-9223372036854775662_1016, #6: failed, blk_-9223372036854775658_1016]. There are not enough healthy streamers.

Stack Trace:
java.util.concurrent.ExecutionException: java.io.IOException: Data streamers failed while creating new block streams: [#5: failed, blk_-9223372036854775659_1016, #4: failed, blk_-9223372036854775660_1016, #0: failed, blk_-9223372036854775664_1016, #1: failed, blk_-9223372036854775663_1016, #2: failed, blk_-9223372036854775662_1016, #6: failed, blk_-9223372036854775658_1016]. There are not enough healthy streamers.
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:188)
        at org.apache.hadoop.hdfs.ErasureCodeBenchmarkThroughput.doBenchmark(ErasureCodeBenchmarkThroughput.java:136)
        at org.apache.hadoop.hdfs.ErasureCodeBenchmarkThroughput.benchmark(ErasureCodeBenchmarkThroughput.java:165)
        at org.apache.hadoop.hdfs.ErasureCodeBenchmarkThroughput.run(ErasureCodeBenchmarkThroughput.java:261)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.hdfs.TestErasureCodeBenchmarkThroughput.runBenchmark(TestErasureCodeBenchmarkThroughput.java:66)
        at org.apache.hadoop.hdfs.TestErasureCodeBenchmarkThroughput.testECReadWrite(TestErasureCodeBenchmarkThroughput.java:105)
Caused by: java.io.IOException: Data streamers failed while creating new block streams: [#5: failed, blk_-9223372036854775659_1016, #4: failed, blk_-9223372036854775660_1016, #0: failed, blk_-9223372036854775664_1016, #1: failed, blk_-9223372036854775663_1016, #2: failed, blk_-9223372036854775662_1016, #6: failed, blk_-9223372036854775658_1016]. There are not enough healthy streamers.
        at org.apache.hadoop.hdfs.DFSStripedOutputStream.checkStreamerFailures(DFSStripedOutputStream.java:631)
        at org.apache.hadoop.hdfs.DFSStripedOutputStream.writeChunk(DFSStripedOutputStream.java:547)
        at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunks(FSOutputSummer.java:217)
        at org.apache.hadoop.fs.FSOutputSummer.write1(FSOutputSummer.java:125)
        at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:111)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:57)
        at java.io.DataOutputStream.write(DataOutputStream.java:107)
        at org.apache.hadoop.hdfs.ErasureCodeBenchmarkThroughput$WriteCallable.writeFile(ErasureCodeBenchmarkThroughput.java:321)
        at org.apache.hadoop.hdfs.ErasureCodeBenchmarkThroughput$WriteCallable.call(ErasureCodeBenchmarkThroughput.java:345)
        at org.apache.hadoop.hdfs.ErasureCodeBenchmarkThroughput$WriteCallable.call(ErasureCodeBenchmarkThroughput.java:300)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)


FAILED:  org.apache.hadoop.hdfs.server.balancer.TestBalancerWithMultipleNameNodes.testBalancing2OutOf3Blockpools

Error Message:
Creating block, no free space available

Stack Trace:
java.io.IOException: Creating block, no free space available
        at org.apache.hadoop.hdfs.server.datanode.SimulatedFSDataset$BInfo.<init>(SimulatedFSDataset.java:147)
        at org.apache.hadoop.hdfs.server.datanode.SimulatedFSDataset.injectBlocks(SimulatedFSDataset.java:575)
        at org.apache.hadoop.hdfs.MiniDFSCluster.injectBlocks(MiniDFSCluster.java:2662)
        at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithMultipleNameNodes.unevenDistribution(TestBalancerWithMultipleNameNodes.java:405)
        at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithMultipleNameNodes.testBalancing2OutOf3Blockpools(TestBalancerWithMultipleNameNodes.java:516)


FAILED:  org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs.testMissingPropertiesWithSecureHDFS

Error Message:
Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Connection reset)]; Host Details : local host is: "asf909.gq1.ygridcore.net/67.195.81.153"; destination host is: "localhost":37362;

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Connection reset)]; Host Details : local host is: "asf909.gq1.ygridcore.net/67.195.81.153"; destination host is: "localhost":37362;
        at java.net.SocketInputStream.read(SocketInputStream.java:196)
        at java.net.SocketInputStream.read(SocketInputStream.java:122)
        at java.io.BufferedInputStream.fill(BufferedInputStream.java:235)
        at java.io.BufferedInputStream.read1(BufferedInputStream.java:275)
        at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
        at sun.security.krb5.internal.TCPClient.readFully(NetClient.java:132)
        at sun.security.krb5.internal.TCPClient.receive(NetClient.java:84)
        at sun.security.krb5.KdcComm$KdcCommunication.run(KdcComm.java:390)
        at sun.security.krb5.KdcComm$KdcCommunication.run(KdcComm.java:343)
        at java.security.AccessController.doPrivileged(Native Method)
        at sun.security.krb5.KdcComm.send(KdcComm.java:327)
        at sun.security.krb5.KdcComm.send(KdcComm.java:219)
        at sun.security.krb5.KdcComm.send(KdcComm.java:191)
        at sun.security.krb5.KrbTgsReq.send(KrbTgsReq.java:187)
        at sun.security.krb5.KrbTgsReq.sendAndGetCreds(KrbTgsReq.java:202)
        at sun.security.krb5.internal.CredentialsUtil.serviceCreds(CredentialsUtil.java:311)
        at sun.security.krb5.internal.CredentialsUtil.acquireServiceCreds(CredentialsUtil.java:115)
        at sun.security.krb5.Credentials.acquireServiceCreds(Credentials.java:449)
        at sun.security.jgss.krb5.Krb5Context.initSecContext(Krb5Context.java:641)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:248)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:411)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:565)
        at org.apache.hadoop.ipc.Client$Connection.access$1900(Client.java:378)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:750)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:746)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:745)
        at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:378)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1413)
        at org.apache.hadoop.ipc.Client.call(Client.java:1328)
        at org.apache.hadoop.ipc.Client.call(Client.java:1306)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
        at com.sun.proxy.$Proxy25.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:536)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:257)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
        at com.sun.proxy.$Proxy26.mkdirs(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2295)
        at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2270)
        at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1079)
        at org.apache.hadoop.hdfs.DistributedFileSystem$24.doCall(DistributedFileSystem.java:1076)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:1076)
        at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:1069)
        at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1909)
        at org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs.createDirectoriesSecurely(TestRollingFileSystemSinkWithSecureHdfs.java:206)
        at org.apache.hadoop.metrics2.sink.TestRollingFileSystemSinkWithSecureHdfs.testMissingPropertiesWithSecureHDFS(TestRollingFileSystemSinkWithSecureHdfs.java:146)


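To reproduce only these three failures locally, here is a minimal sketch, assuming a trunk checkout. All three classes ran in the hadoop-hdfs module (the surefire-reports path above points there), -pl/-am are standard Maven reactor flags, and Surefire's -Dtest filter accepts a comma-separated list of test class names:

  # Build the module's dependencies once, skipping tests
  mvn install -DskipTests -pl hadoop-hdfs-project/hadoop-hdfs -am
  # Run only the three failing test classes in the hadoop-hdfs module
  mvn test -pl hadoop-hdfs-project/hadoop-hdfs \
    -Dtest=TestErasureCodeBenchmarkThroughput,TestBalancerWithMultipleNameNodes,TestRollingFileSystemSinkWithSecureHdfs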