Looks like my change to use warn() instead of log.warn is causing issues for Jenkins.
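For what it's worth, the warning text suggests the warn() path delegates to a runtime-supplied logger and falls back when none has been attached — which is exactly the situation when a UDF's exec() is called straight from JUnit with no Pig runtime around it. A minimal sketch of that null-safe pattern (not the actual Pig source; every name here is invented for illustration):

```java
// Hypothetical sketch, NOT the actual Pig source: the idea behind routing UDF
// warnings through a runtime-attached logger, with a graceful fallback when
// the UDF runs outside any Pig runtime (e.g. straight from a JUnit test).
import java.util.ArrayList;
import java.util.List;

public class WarnFallbackSketch {

    // Stand-in for the aggregating logger a Pig runtime would attach.
    interface RuntimeLogger {
        void warn(Object source, String msg, Enum<?> kind);
    }

    enum Warning { UDF_WARNING_1 }

    RuntimeLogger pigLogger;                      // stays null in a bare unit test
    final List<String> fallbackLog = new ArrayList<>();

    // warn() prefers the runtime logger (which can aggregate warning counts
    // per job) and degrades to a local log line instead of failing hard.
    void warn(String msg, Enum<?> kind) {
        if (pigLogger != null) {
            pigLogger.warn(this, msg, kind);
        } else {
            fallbackLog.add("WARN " + kind + ": " + msg);
        }
    }

    public static void main(String[] args) {
        WarnFallbackSketch udf = new WarnFallbackSketch();
        // No logger attached, as in the TestStringUDFs runs below:
        udf.warn("String index out of range: -2", Warning.UDF_WARNING_1);
        System.out.println(udf.fallbackLog.get(0));
        // prints: WARN UDF_WARNING_1: String index out of range: -2
    }
}
```

If the real fallback just writes a WARN line like the ones below, these messages would be expected noise in a standalone test run rather than actual failures — but that's an assumption worth confirming against the test results.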
[junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.NullPointerException
[junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
[junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -1
[junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -8
[junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
[junit] 11/08/11 22:32:36 WARN builtin.INDEXOF: No logger object provided to UDF: org.apache.pig.builtin.INDEXOF. Failed to process input; error - null
[junit] 11/08/11 22:32:36 WARN builtin.LAST_INDEX_OF: No logger object provided to UDF: org.apache.pig.builtin.LAST_INDEX_OF. Failed to process input; error - null

Any idea if this is an environment thing or a me thing?

D

On Thu, Aug 11, 2011 at 3:32 PM, Apache Jenkins Server <jenk...@builds.apache.org> wrote:

> See <https://builds.apache.org/job/Pig-trunk/1061/changes>
>
> Changes:
>
> [dvryaboy] PIG-2174: HBaseStorage column filters miss some fields
>
> ------------------------------------------
> [...truncated 38358 lines...]
> [junit]     at javax.security.auth.Subject.doAs(Subject.java:396)
> [junit]     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> [junit] 11/08/11 22:32:34 ERROR hdfs.DFSClient: Exception closing file /tmp/TestStore-output--2391036896539643097.txt_cleanupOnFailure_succeeded2 : org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /tmp/TestStore-output--2391036896539643097.txt_cleanupOnFailure_succeeded2 by DFSClient_1110622717
> [junit]     at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
> [junit]     at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
> [junit]     at java.security.AccessController.doPrivileged(Native Method)
> [junit]     at javax.security.auth.Subject.doAs(Subject.java:396)
> [junit]     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> [junit]
> [junit] org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /tmp/TestStore-output--2391036896539643097.txt_cleanupOnFailure_succeeded2 by DFSClient_1110622717
> [junit]     at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
> [junit]     at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
> [junit]     at java.security.AccessController.doPrivileged(Native Method)
> [junit]     at javax.security.auth.Subject.doAs(Subject.java:396)
> [junit]     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> [junit]
> [junit]     at org.apache.hadoop.ipc.Client.call(Client.java:740)
> [junit]     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> [junit]     at $Proxy0.complete(Unknown Source)
> [junit]     at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> [junit]     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> [junit]     at $Proxy0.complete(Unknown Source)
> [junit]     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.closeInternal(DFSClient.java:3264)
> [junit]     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.close(DFSClient.java:3188)
> [junit]     at org.apache.hadoop.hdfs.DFSClient$LeaseChecker.close(DFSClient.java:1043)
> [junit]     at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:237)
> [junit]     at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:269)
> [junit]     at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:83)
> [junit]     at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
> [junit]     at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
> [junit]     at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:127)
> [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
> [junit] Shutting down the Mini HDFS Cluster
> [junit]     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
> [junit]     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
> [junit] Shutting down DataNode 3
> [junit]     at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
> [junit]     at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
> [junit]     at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
> [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
> [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
> [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
> [junit] 11/08/11 22:32:34 WARN hdfs.StateChange: DIR* NameSystem.completeFile: failed to complete /tmp/TestStore-output--6146295549410389201.txt_cleanupOnFailure_succeeded because dir.getFileBlocks() is null and pendingFile is null
> [junit] 11/08/11 22:32:34 INFO ipc.Server: IPC Server handler 7 on 55915, call complete(/tmp/TestStore-output--6146295549410389201.txt_cleanupOnFailure_succeeded, DFSClient_1110622717) from 127.0.0.1:34096: error: java.io.IOException: Could not complete write to file /tmp/TestStore-output--6146295549410389201.txt_cleanupOnFailure_succeeded by DFSClient_1110622717
> [junit] java.io.IOException: Could not complete write to file /tmp/TestStore-output--6146295549410389201.txt_cleanupOnFailure_succeeded by DFSClient_1110622717
> [junit]     at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
> [junit]     at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
> [junit]     at java.security.AccessController.doPrivileged(Native Method)
> [junit]     at javax.security.auth.Subject.doAs(Subject.java:396)
> [junit]     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> [junit] 11/08/11 22:32:34 ERROR hdfs.DFSClient: Exception closing file /tmp/TestStore-output--6146295549410389201.txt_cleanupOnFailure_succeeded : org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /tmp/TestStore-output--6146295549410389201.txt_cleanupOnFailure_succeeded by DFSClient_1110622717
> [junit]     at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
> [junit]     at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
> [junit]     at java.security.AccessController.doPrivileged(Native Method)
> [junit]     at javax.security.auth.Subject.doAs(Subject.java:396)
> [junit]     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> [junit]
> [junit] org.apache.hadoop.ipc.RemoteException: java.io.IOException: Could not complete write to file /tmp/TestStore-output--6146295549410389201.txt_cleanupOnFailure_succeeded by DFSClient_1110622717
> [junit]     at org.apache.hadoop.hdfs.server.namenode.NameNode.complete(NameNode.java:449)
> [junit]     at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
> [junit]     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
> [junit]     at java.security.AccessController.doPrivileged(Native Method)
> [junit]     at javax.security.auth.Subject.doAs(Subject.java:396)
> [junit]     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
> [junit]
> [junit]     at org.apache.hadoop.ipc.Client.call(Client.java:740)
> [junit]     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
> [junit]     at $Proxy0.complete(Unknown Source)
> [junit]     at sun.reflect.GeneratedMethodAccessor18.invoke(Unknown Source)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> [junit]     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> [junit]     at $Proxy0.complete(Unknown Source)
> [junit]     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.closeInternal(DFSClient.java:3264)
> [junit]     at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.close(DFSClient.java:3188)
> [junit]     at org.apache.hadoop.hdfs.DFSClient$LeaseChecker.close(DFSClient.java:1043)
> [junit]     at org.apache.hadoop.hdfs.DFSClient.close(DFSClient.java:237)
> [junit]     at org.apache.hadoop.hdfs.DistributedFileSystem.close(DistributedFileSystem.java:269)
> [junit]     at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsClusters(MiniGenericCluster.java:83)
> [junit]     at org.apache.pig.test.MiniGenericCluster.shutdownMiniDfsAndMrClusters(MiniGenericCluster.java:77)
> [junit]     at org.apache.pig.test.MiniGenericCluster.shutDown(MiniGenericCluster.java:68)
> [junit]     at org.apache.pig.test.TestStore.oneTimeTearDown(TestStore.java:127)
> [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
> [junit]     at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
> [junit]     at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
> [junit]     at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
> [junit]     at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:37)
> [junit]     at org.junit.runners.ParentRunner.run(ParentRunner.java:220)
> [junit]     at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
> [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
> [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
> [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
> [junit] 11/08/11 22:32:34 INFO ipc.Server: Stopping server on 57381
> [junit] 11/08/11 22:32:34 INFO ipc.Server: IPC Server handler 0 on 57381: exiting
> [junit] 11/08/11 22:32:34 INFO ipc.Server: IPC Server handler 1 on 57381: exiting
> [junit] 11/08/11 22:32:34 INFO ipc.Server: IPC Server handler 2 on 57381: exiting
> [junit] 11/08/11 22:32:34 INFO ipc.Server: Stopping IPC Server Responder
> [junit] 11/08/11 22:32:34 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
> [junit] 11/08/11 22:32:34 INFO ipc.Server: Stopping IPC Server listener on 57381
> [junit] 11/08/11 22:32:34 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:59929, storageID=DS-1682051249-67.195.138.24-59929-1313101600271, infoPort=40107, ipcPort=57381):DataXceiveServer: java.nio.channels.AsynchronousCloseException
> [junit]     at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
> [junit]     at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
> [junit]     at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
> [junit]     at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:130)
> [junit]     at java.lang.Thread.run(Thread.java:662)
> [junit]
> [junit] 11/08/11 22:32:35 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:59929 to delete blk_2432976192841571024_1123 blk_-7679899745762152581_1122 blk_2019879002996493503_1124
> [junit] 11/08/11 22:32:35 INFO hdfs.StateChange: BLOCK* ask 127.0.0.1:43086 to delete blk_5900480943187652691_1121 blk_-6745305478165067570_1126 blk_3960175366372076002_1127 blk_-7679899745762152581_1122 blk_2019879002996493503_1124
> [junit] 11/08/11 22:32:35 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:59929, storageID=DS-1682051249-67.195.138.24-59929-1313101600271, infoPort=40107, ipcPort=57381):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data7/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data8/current'}>
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 57381
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
> [junit] Shutting down DataNode 2
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 43101
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 0 on 43101: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 1 on 43101: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 2 on 43101: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server listener on 43101
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server Responder
> [junit] 11/08/11 22:32:35 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:43086, storageID=DS-358171583-67.195.138.24-43086-1313101600021, infoPort=34545, ipcPort=43101):DataXceiveServer: java.nio.channels.AsynchronousCloseException
> [junit]     at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
> [junit]     at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
> [junit]     at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
> [junit]     at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:130)
> [junit]     at java.lang.Thread.run(Thread.java:662)
> [junit]
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
> [junit] 11/08/11 22:32:35 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:43086, storageID=DS-358171583-67.195.138.24-43086-1313101600021, infoPort=34545, ipcPort=43101):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data5/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data6/current'}>
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 43101
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
> [junit] Shutting down DataNode 1
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 49157
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 0 on 49157: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 2 on 49157: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 1 on 49157: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server listener on 49157
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server Responder
> [junit] 11/08/11 22:32:35 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:44247, storageID=DS-775736738-67.195.138.24-44247-1313101599770, infoPort=46805, ipcPort=49157):DataXceiveServer: java.nio.channels.AsynchronousCloseException
> [junit]     at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
> [junit]     at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
> [junit]     at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
> [junit]     at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:130)
> [junit]     at java.lang.Thread.run(Thread.java:662)
> [junit]
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
> [junit] 11/08/11 22:32:35 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:44247, storageID=DS-775736738-67.195.138.24-44247-1313101599770, infoPort=46805, ipcPort=49157):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data3/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data4/current'}>
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 49157
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
> [junit] Shutting down DataNode 0
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 34624
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 0 on 34624: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server Responder
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server listener on 34624
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 2 on 34624: exiting
> [junit] 11/08/11 22:32:35 WARN datanode.DataNode: DatanodeRegistration(127.0.0.1:35315, storageID=DS-906245876-67.195.138.24-35315-1313101599270, infoPort=46020, ipcPort=34624):DataXceiveServer: java.nio.channels.AsynchronousCloseException
> [junit]     at java.nio.channels.spi.AbstractInterruptibleChannel.end(AbstractInterruptibleChannel.java:185)
> [junit]     at sun.nio.ch.ServerSocketChannelImpl.accept(ServerSocketChannelImpl.java:159)
> [junit]     at sun.nio.ch.ServerSocketAdaptor.accept(ServerSocketAdaptor.java:84)
> [junit]     at org.apache.hadoop.hdfs.server.datanode.DataXceiverServer.run(DataXceiverServer.java:130)
> [junit]     at java.lang.Thread.run(Thread.java:662)
> [junit]
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 1
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 1 on 34624: exiting
> [junit] 11/08/11 22:32:35 INFO datanode.DataBlockScanner: Exiting DataBlockScanner thread.
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: DatanodeRegistration(127.0.0.1:35315, storageID=DS-906245876-67.195.138.24-35315-1313101599270, infoPort=46020, ipcPort=34624):Finishing DataNode in: FSDataset{dirpath='<https://builds.apache.org/job/Pig-trunk/ws/trunk/build/test/data/dfs/data/data1/current,/home/jenkins/jenkins-slave/workspace/Pig-trunk/trunk/build/test/data/dfs/data/data2/current'}>
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 34624
> [junit] 11/08/11 22:32:35 INFO datanode.DataNode: Waiting for threadgroup to exit, active threads is 0
> [junit] 11/08/11 22:32:35 WARN namenode.FSNamesystem: ReplicationMonitor thread received InterruptedException.java.lang.InterruptedException: sleep interrupted
> [junit] 11/08/11 22:32:35 INFO namenode.FSNamesystem: Number of transactions: 694 Total time for transactions(ms): 9 Number of transactions batched in Syncs: 114 Number of syncs: 484 SyncTimes(ms): 6591 349
> [junit] 11/08/11 22:32:35 INFO namenode.DecommissionManager: Interrupted Monitor
> [junit] java.lang.InterruptedException: sleep interrupted
> [junit]     at java.lang.Thread.sleep(Native Method)
> [junit]     at org.apache.hadoop.hdfs.server.namenode.DecommissionManager$Monitor.run(DecommissionManager.java:65)
> [junit]     at java.lang.Thread.run(Thread.java:662)
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping server on 55915
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 0 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 1 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 2 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 3 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 4 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 5 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 6 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 7 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 8 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: IPC Server handler 9 on 55915: exiting
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server listener on 55915
> [junit] 11/08/11 22:32:35 INFO ipc.Server: Stopping IPC Server Responder
> [junit] Tests run: 17, Failures: 0, Errors: 0, Time elapsed: 353.659 sec
> [junit] Running org.apache.pig.test.TestStringUDFs
> [junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.NullPointerException
> [junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
> [junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -1
> [junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -8
> [junit] 11/08/11 22:32:36 WARN builtin.SUBSTRING: No logger object provided to UDF: org.apache.pig.builtin.SUBSTRING. java.lang.StringIndexOutOfBoundsException: String index out of range: -2
> [junit] 11/08/11 22:32:36 WARN builtin.INDEXOF: No logger object provided to UDF: org.apache.pig.builtin.INDEXOF. Failed to process input; error - null
> [junit] 11/08/11 22:32:36 WARN builtin.LAST_INDEX_OF: No logger object provided to UDF: org.apache.pig.builtin.LAST_INDEX_OF. Failed to process input; error - null
> [junit] Tests run: 11, Failures: 0, Errors: 0, Time elapsed: 0.084 sec
> [delete] Deleting directory /tmp/pig_junit_tmp58053297
>
> BUILD FAILED
> <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:673: The following error occurred while executing this line:
> <https://builds.apache.org/job/Pig-trunk/ws/trunk/build.xml>:728: Tests failed!
>
> Total time: 21 minutes 59 seconds
> [FINDBUGS] Skipping publisher since build result is FAILURE
> Recording test results
> Publishing Javadoc
> Archiving artifacts
> Recording fingerprints
> Publishing Clover coverage report...
> No Clover report will be published due to a Build Failure
>