See https://builds.apache.org/job/Hadoop-Mapreduce-trunk/2382/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########
[...truncated 33566 lines...]
  TestDatamerge$1>TestSetup.run:27->tearDown:69 » NoClassDefFound org/apache/had...
  TestMRTimelineEventHandling.testMRTimelineEventHandling:131 » YarnRuntime java...
  TestClusterMRNotification>NotificationTestCase.testMR:178 » IO Job cleanup did...
  TestMRIntermediateDataEncryption.testSingleReducer:55->doEncryptionTest:75->doEncryptionTest:95->runMergeTest:161->verifyOutput:176 » FileNotFound
  TestNonExistentJob.setUp:73 » YarnRuntime java.io.IOException: ResourceManager...
  TestMRAMWithNonNormalizedCapabilities.setup:72 » NoClassDefFound org/apache/ha...
  TestMRAMWithNonNormalizedCapabilities.tearDown:118 » NoClassDefFound org/apach...

Tests run: 344, Failures: 1, Errors: 11, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.838 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.626 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.676 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [09:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [05:39 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:09 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:27 h
[INFO] Finished at: 2015-09-25T00:23:01+00:00
[INFO] Final Memory: 42M/797M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.7.0_55/jre/bin/java -Xmx4096m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter4879566994901274104.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire271960283039644769tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1643745166749477679146tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Sending artifact delta relative to Hadoop-Mapreduce-trunk #2330
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 20471662 bytes
Compression is 0.0%
Took 5.9 sec
Recording test results
Updating HADOOP-12437
Updating HADOOP-8436
Updating YARN-3624
Updating HADOOP-12252
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
12 tests failed.
REGRESSION:  org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.testMultipleSpills

Error Message:
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/protocol/proto/ClientNamenodeProtocolProtos$GetListingRequestProto$Builder

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/protocol/proto/ClientNamenodeProtocolProtos$GetListingRequestProto$Builder
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetListingRequestProto.newBuilder(ClientNamenodeProtocolProtos.java:29094)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:569)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:251)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
        at com.sun.proxy.$Proxy22.getListing(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1648)
        at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:211)
        at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:198)
        at org.apache.hadoop.fs.Hdfs$2.<init>(Hdfs.java:180)
        at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:180)
        at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1489)
        at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1484)
        at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
        at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1491)
        at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:456)
        at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
        at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
        at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:778)
        at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:672)
        at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:97)
        at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
        at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
        at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:151)
        at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
        at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:210)
        at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
        at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
        at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
        at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
        at org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.setUp(TestMiniMRWithDFSWithDistinctUsers.java:97)


REGRESSION:  org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.testDistinctUsers

Error Message:
null

Stack Trace:
java.lang.IllegalStateException: null
        at com.google.common.base.Preconditions.checkState(Preconditions.java:129)
        at org.apache.hadoop.ipc.Client.setCallIdAndRetryCount(Client.java:123)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:100)
        at com.sun.proxy.$Proxy22.getDatanodeReport(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.datanodeReport(DFSClient.java:2124)
        at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2385)
        at org.apache.hadoop.hdfs.MiniDFSCluster.waitActive(MiniDFSCluster.java:2428)
        at org.apache.hadoop.hdfs.MiniDFSCluster.startDataNodes(MiniDFSCluster.java:1607)
        at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:840)
        at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:478)
        at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:437)
        at org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.setUp(TestMiniMRWithDFSWithDistinctUsers.java:78)


FAILED:  org.apache.hadoop.mapred.TestReduceFetchFromPartialMem$1.org.apache.hadoop.mapred.TestReduceFetchFromPartialMem

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
        at org.apache.hadoop.mapred.TestReduceFetchFromPartialMem$1.setUp(TestReduceFetchFromPartialMem.java:61)


FAILED:  org.apache.hadoop.mapred.join.TestDatamerge$1.org.apache.hadoop.mapred.join.TestDatamerge

Error Message:
org/apache/hadoop/hdfs/server/namenode/JournalSet$5

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/JournalSet$5
        at org.apache.hadoop.hdfs.server.namenode.JournalSet.close(JournalSet.java:243)
        at org.apache.hadoop.hdfs.server.namenode.FSEditLog.close(FSEditLog.java:368)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1199)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1537)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:721)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:886)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1866)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1835)
        at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1828)
        at org.apache.hadoop.mapred.join.TestDatamerge$1.tearDown(TestDatamerge.java:69)


REGRESSION:  org.apache.hadoop.mapreduce.TestMapReduceLazyOutput.testLazyOutput

Error Message:
null

Stack Trace:
junit.framework.AssertionFailedError: null
        at junit.framework.Assert.fail(Assert.java:55)
        at junit.framework.Assert.assertTrue(Assert.java:22)
        at junit.framework.Assert.assertTrue(Assert.java:31)
        at junit.framework.TestCase.assertTrue(TestCase.java:201)
        at org.apache.hadoop.mapreduce.TestMapReduceLazyOutput.runTestLazyOutput(TestMapReduceLazyOutput.java:110)
        at org.apache.hadoop.mapreduce.TestMapReduceLazyOutput.testLazyOutput(TestMapReduceLazyOutput.java:146)


REGRESSION:  org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities.testJobWithNonNormalizedCapabilities

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.service.AbstractService.start(AbstractService.java:203)
        at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
        at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
        at org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities.setup(TestMRAMWithNonNormalizedCapabilities.java:72)


REGRESSION:  org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities.testJobWithNonNormalizedCapabilities

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
        at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
        at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
        at org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities.tearDown(TestMRAMWithNonNormalizedCapabilities.java:118)


REGRESSION:  org.apache.hadoop.mapreduce.v2.TestNonExistentJob.testGetInvalidJob

Error Message:
java.io.IOException: ResourceManager failed to start. Final state is STOPPED

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.io.IOException: ResourceManager failed to start. Final state is STOPPED
        at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:332)
        at org.apache.hadoop.yarn.server.MiniYARNCluster.access$500(MiniYARNCluster.java:99)
        at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:456)
        at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
        at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
        at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
        at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
        at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
        at org.apache.hadoop.mapreduce.v2.TestNonExistentJob.setUp(TestNonExistentJob.java:73)


FAILED:  org.apache.hadoop.mapred.TestClusterMRNotification.testMR

Error Message:
Job cleanup didn't start in 20 seconds

Stack Trace:
java.io.IOException: Job cleanup didn't start in 20 seconds
        at org.apache.hadoop.mapred.UtilsForTests.runJobKill(UtilsForTests.java:685)
        at org.apache.hadoop.mapred.NotificationTestCase.testMR(NotificationTestCase.java:178)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at junit.framework.TestCase.runTest(TestCase.java:176)
        at junit.framework.TestCase.runBare(TestCase.java:141)
        at junit.framework.TestResult$1.protect(TestResult.java:122)
        at junit.framework.TestResult.runProtected(TestResult.java:142)
        at junit.framework.TestResult.run(TestResult.java:125)
        at junit.framework.TestCase.run(TestCase.java:129)
        at junit.framework.TestSuite.runTest(TestSuite.java:255)
        at junit.framework.TestSuite.run(TestSuite.java:250)
        at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:84)
        at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
        at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
        at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
        at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
        at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
        at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestLazyOutput.testLazyOutput

Error Message:
org/apache/hadoop/util/ShutdownThreadsHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownThreadsHelper
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:635)
        at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
        at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:171)
        at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
        at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
        at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
        at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
        at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
        at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:211)
        at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
        at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:248)
        at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
        at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
        at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
        at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
        at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
        at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
        at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
        at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
        at org.apache.hadoop.mapred.TestLazyOutput.testLazyOutput(TestLazyOutput.java:195)


FAILED:  org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testSingleReducer

Error Message:
Path is not a file: /test/output/_temporary
 at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:75)
 at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
 at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:162)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1670)
 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:595)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:636)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:976)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2230)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2226)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2226)


Stack Trace:
java.io.FileNotFoundException: Path is not a file: /test/output/_temporary
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:75)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
        at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getBlockLocations(FSDirStatAndListingOp.java:162)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1670)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:595)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:365)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:636)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:976)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2230)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2226)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2226)

        at org.apache.hadoop.ipc.Client.call(Client.java:1445)
        at org.apache.hadoop.ipc.Client.call(Client.java:1376)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy19.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:256)
        at sun.reflect.GeneratedMethodAccessor108.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:251)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
        at com.sun.proxy.$Proxy20.getBlockLocations(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:839)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:826)
        at org.apache.hadoop.hdfs.DFSClient.getLocatedBlocks(DFSClient.java:814)
        at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:315)
        at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:277)
        at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:265)
        at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1060)
        at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:276)
        at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:271)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:284)
        at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:771)
        at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.verifyOutput(TestMRIntermediateDataEncryption.java:176)
        at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.runMergeTest(TestMRIntermediateDataEncryption.java:161)
        at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:95)
        at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:75)
        at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testSingleReducer(TestMRIntermediateDataEncryption.java:55)


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob

Error Message:
org.apache.hadoop.yarn.exceptions.YarnException: java.io.IOException: 
Delegation Token can be issued only with kerberos authentication
 at org.apache.hadoop.yarn.ipc.RPCUtil.getRemoteException(RPCUtil.java:38)
 at 
org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.getDelegationToken(ClientRMService.java:959)
 at 
org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.getDelegationToken(ApplicationClientProtocolPBServiceImpl.java:301)
 at 
org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:449)
 at 
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:636)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:976)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2230)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2226)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2224)
Caused by: java.io.IOException: Delegation Token can be issued only with 
kerberos authentication
 at 
org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.getDelegationToken(ClientRMService.java:932)
 ... 10 more


Stack Trace:
java.io.IOException: org.apache.hadoop.yarn.exceptions.YarnException: java.io.IOException: Delegation Token can be issued only with kerberos authentication
        at org.apache.hadoop.yarn.ipc.RPCUtil.getRemoteException(RPCUtil.java:38)
        at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.getDelegationToken(ClientRMService.java:959)
        at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.getDelegationToken(ApplicationClientProtocolPBServiceImpl.java:301)
        at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:449)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:636)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:976)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2230)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2226)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2224)
Caused by: java.io.IOException: Delegation Token can be issued only with kerberos authentication
        at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.getDelegationToken(ClientRMService.java:932)
        ... 10 more

        at org.apache.hadoop.ipc.Client.call(Client.java:1445)
        at org.apache.hadoop.ipc.Client.call(Client.java:1376)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
        at com.sun.proxy.$Proxy80.getDelegationToken(Unknown Source)
        at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getDelegationToken(ApplicationClientProtocolPBClientImpl.java:316)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:251)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
        at com.sun.proxy.$Proxy81.getDelegationToken(Unknown Source)
        at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getRMDelegationToken(YarnClientImpl.java:529)
        at org.apache.hadoop.mapred.ResourceMgrDelegate.getDelegationToken(ResourceMgrDelegate.java:176)
        at org.apache.hadoop.mapred.YARNRunner.getDelegationToken(YARNRunner.java:231)
        at org.apache.hadoop.mapreduce.Cluster.getDelegationToken(Cluster.java:401)
        at org.apache.hadoop.mapred.JobClient$16.run(JobClient.java:1234)
        at org.apache.hadoop.mapred.JobClient$16.run(JobClient.java:1231)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1667)
        at org.apache.hadoop.mapred.JobClient.getDelegationToken(JobClient.java:1230)
        at org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob(TestNetworkedJob.java:260)

