I suspect this error only happens once in a while; we didn't change anything in these tests recently. Your PR for fixing this issue looks good, so maybe it's fixing it.
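As background: a `NoSuchFieldError: IBM_JAVA` thrown while `UserGroupInformation` initializes usually means mixed Hadoop jar versions on the test classpath (e.g. an older `hadoop-auth` or `hadoop-core` next to a newer `hadoop-common`). A quick way to check is Maven's standard dependency plugin; this is a generic diagnostic sketch, not a confirmed diagnosis of this particular failure, and the module you run it in would be whichever one contains the failing tests:

```shell
# Print every org.apache.hadoop artifact (with its version) that ends up
# on this module's classpath. Two different versions of hadoop-common /
# hadoop-auth appearing together is the usual cause of
# "NoSuchFieldError: IBM_JAVA".
mvn dependency:tree -Dincludes=org.apache.hadoop
```

If two versions show up, excluding the transitive older one in the POM typically clears the error.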
On Thu, Mar 26, 2015 at 3:15 AM, Henry Saputra <henry.sapu...@gmail.com> wrote:

> Hi All,
>
> I just pulled from master and it seems like mvn test fails:
>
> -------------------------------------------------------
>  T E S T S
> -------------------------------------------------------
> Running org.apache.flink.tachyon.HDFSTest
> Running org.apache.flink.tachyon.TachyonFileSystemWrapperTest
> java.lang.NoSuchFieldError: IBM_JAVA
>   at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:303)
>   at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:348)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:807)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:266)
>   at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:122)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:775)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:642)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
>   at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
>   at org.apache.flink.tachyon.HDFSTest.createHDFS(HDFSTest.java:62)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
>   at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>   at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
>   at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
>   at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
>   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
>   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
>   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:807)
>   at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:266)
>   at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:122)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:775)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:642)
>   at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
>   at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
>   at org.apache.flink.tachyon.HDFSTest.createHDFS(HDFSTest.java:62)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:606)
>   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
>   at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
>   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>   at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
>   at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
>   at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
>   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
>   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
>   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
> Tests run: 4, Failures: 2, Errors: 2, Skipped: 0, Time elapsed: 1.13 sec <<< FAILURE! - in org.apache.flink.tachyon.HDFSTest
> testHDFS(org.apache.flink.tachyon.HDFSTest)  Time elapsed: 0.867 sec  <<< FAILURE!
> java.lang.AssertionError: Test failed IBM_JAVA
>   at org.junit.Assert.fail(Assert.java:88)
>   at org.apache.flink.tachyon.HDFSTest.createHDFS(HDFSTest.java:76)
>
> testHDFS(org.apache.flink.tachyon.HDFSTest)  Time elapsed: 0.867 sec  <<< ERROR!
> java.lang.NullPointerException: null
>   at org.apache.flink.tachyon.HDFSTest.destroyHDFS(HDFSTest.java:83)
>
> testAvroOut(org.apache.flink.tachyon.HDFSTest)  Time elapsed: 0.071 sec  <<< FAILURE!
> java.lang.AssertionError: Test failed Could not initialize class org.apache.hadoop.security.UserGroupInformation
>   at org.junit.Assert.fail(Assert.java:88)
>   at org.apache.flink.tachyon.HDFSTest.createHDFS(HDFSTest.java:76)
>
> testAvroOut(org.apache.flink.tachyon.HDFSTest)  Time elapsed: 0.072 sec  <<< ERROR!
> java.lang.NullPointerException: null
>   at org.apache.flink.tachyon.HDFSTest.destroyHDFS(HDFSTest.java:83)
>
> org.apache.flink.runtime.client.JobExecutionException: Cannot initialize task 'DataSink (CsvOutputFormat (path: tachyon://x1carbon:18998/result, delimiter: ))': Could not initialize class org.apache.hadoop.security.UserGroupInformation
>   at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$2.apply(JobManager.scala:517)
>   at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$2.apply(JobManager.scala:501)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:727)
>   at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>   at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
>   at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
>   at org.apache.flink.runtime.jobmanager.JobManager.org$apache$flink$runtime$jobmanager$JobManager$$submitJob(JobManager.scala:501)
>   at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$receiveWithLogMessages$1.applyOrElse(JobManager.scala:183)
>   at scala.runtime.AbstractPartialFunction$mcVL$sp.apply$mcVL$sp(AbstractPartialFunction.scala:33)
>   at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:33)
>   at scala.runtime.AbstractPartialFunction$mcVL$sp.apply(AbstractPartialFunction.scala:25)
>   at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:37)
>   at org.apache.flink.runtime.ActorLogMessages$$anon$1.apply(ActorLogMessages.scala:30)
>   at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:118)
>   at org.apache.flink.runtime.ActorLogMessages$$anon$1.applyOrElse(ActorLogMessages.scala:30)
>   at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>   at org.apache.flink.runtime.jobmanager.JobManager.aroundReceive(JobManager.scala:89)
>   at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>   at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>   at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:254)
>   at akka.dispatch.Mailbox.run(Mailbox.scala:221)
>   at akka.dispatch.Mailbox.exec(Mailbox.scala:231)
>   at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>   at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.pollAndExecAll(ForkJoinPool.java:1253)
>   at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1346)
>   at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>   at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
>   at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2590)
>   at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2582)
>   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2448)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:351)
>   at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
>   at tachyon.hadoop.TFS.fromHdfsToTachyon(TFS.java:182)
>   at tachyon.hadoop.TFS.getFileStatus(TFS.java:243)
>   at org.apache.flink.runtime.fs.hdfs.HadoopFileSystem.getFileStatus(HadoopFileSystem.java:343)
>   at org.apache.flink.core.fs.FileSystem.exists(FileSystem.java:414)
>   at org.apache.flink.core.fs.FileSystem.initOutPathDistFS(FileSystem.java:679)
>   at org.apache.flink.api.common.io.FileOutputFormat.initializeGlobal(FileOutputFormat.java:284)
>   at org.apache.flink.runtime.jobgraph.OutputFormatVertex.initializeOnMaster(OutputFormatVertex.java:84)
>   at org.apache.flink.runtime.jobmanager.JobManager$$anonfun$org$apache$flink$runtime$jobmanager$JobManager$$submitJob$2.apply(JobManager.scala:514)
>   ... 26 more
>
> Tests run: 2, Failures: 1, Errors: 1, Skipped: 0, Time elapsed: 5.075 sec <<< FAILURE! - in org.apache.flink.tachyon.TachyonFileSystemWrapperTest
> testTachyon(org.apache.flink.tachyon.TachyonFileSystemWrapperTest)  Time elapsed: 3.592 sec  <<< FAILURE!
> java.lang.AssertionError: Test failed with exception: Cannot initialize task 'DataSink (CsvOutputFormat (path: tachyon://x1carbon:18998/result, delimiter: ))': Could not initialize class org.apache.hadoop.security.UserGroupInformation
>   at org.junit.Assert.fail(Assert.java:88)
>   at org.apache.flink.tachyon.TachyonFileSystemWrapperTest.testTachyon(TachyonFileSystemWrapperTest.java:149)
>
> testHadoopLoadability(org.apache.flink.tachyon.TachyonFileSystemWrapperTest)  Time elapsed: 1.288 sec  <<< ERROR!
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
>   at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2590)
>   at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2582)
>   at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2448)
>   at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
>   at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
>   at org.apache.flink.tachyon.TachyonFileSystemWrapperTest.testHadoopLoadability(TachyonFileSystemWrapperTest.java:116)
>
> Results :
>
> Failed tests:
>   HDFSTest.createHDFS:76 Test failed IBM_JAVA
>   HDFSTest.createHDFS:76 Test failed Could not initialize class org.apache.hadoop.security.UserGroupInformation
>   TachyonFileSystemWrapperTest.testTachyon:149 Test failed with exception: Cannot initialize task 'DataSink (CsvOutputFormat (path: tachyon://x1carbon:18998/result, delimiter: ))': Could not initialize class org.apache.hadoop.security.UserGroupInformation
>
> Tests in error:
>   HDFSTest.destroyHDFS:83 NullPointer
>   HDFSTest.destroyHDFS:83 NullPointer
>   TachyonFileSystemWrapperTest.testHadoopLoadability:116 » NoClassDefFound Could...
>
> Tests run: 6, Failures: 3, Errors: 3, Skipped: 0
>
>
> Anyone else seen this behavior?
>
> - Henry