Do you see the same errors when you run the individual tests in question in
isolation, e.g. using mvn test -Dtest=<test-class>? If you do, can you
paste the full logs of what you see when the individual tests fail?
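For example, something along these lines should run one of the failing
classes by itself (the -pl module name is an assumption about which Maven
module contains these tests; adjust it to your checkout):

```shell
# Run a single test class in isolation; -pl restricts the build to one module
# (assumed here to be the module containing the stram tests).
mvn test -Dtest=StramRecoveryTest -pl engine

# Surefire also accepts Class#method syntax to narrow it to one test method.
mvn test -Dtest=StramRecoveryTest#testWriteAheadLog -pl engine
```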

Thanks

On Mon, Jun 18, 2018 at 11:41 AM Aaron Bossert <aa...@punchcyber.com> wrote:

> Please disregard the first iteration; that turned out to be caused by a
> hung build running in the background, which I think led to the timeouts.
> I am still having failures, but there are two remaining whose root cause
> is still a mystery to me. Here are the actual failures:
>
> I don't immediately see how these are related to Kryo at all, but then
> again, I am still familiarizing myself with the code base. I am hoping
> that someone out there has a lightbulb turn on and has some notion of
> how they are related.
>
>
> -------------------------------------------------------------------------------
> Test set: com.datatorrent.stram.StramRecoveryTest
> -------------------------------------------------------------------------------
> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.119 sec <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.105 sec  <<< FAILURE!
> java.lang.AssertionError: flush count expected:<1> but was:<2>
>   at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>
>
> -------------------------------------------------------------------------------
> Test set: com.datatorrent.stram.engine.StatsTest
> -------------------------------------------------------------------------------
> Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 22.051 sec <<< FAILURE! - in com.datatorrent.stram.engine.StatsTest
> testQueueSizeForContainerLocalOperators(com.datatorrent.stram.engine.StatsTest)  Time elapsed: 3.277 sec  <<< FAILURE!
> java.lang.AssertionError: Validate input port queue size -1
>   at com.datatorrent.stram.engine.StatsTest.baseTestForQueueSize(StatsTest.java:270)
>   at com.datatorrent.stram.engine.StatsTest.testQueueSizeForContainerLocalOperators(StatsTest.java:285)
>
> On Mon, Jun 18, 2018 at 1:20 PM Aaron Bossert <aa...@punchcyber.com> wrote:
>
> > I recently attempted to update Kryo from 2.24.0 to 4.0.2 to address a
> > serialization issue related to support for Java Instant and a couple
> > of other classes that are supported in newer Kryo versions. My test
> > build and install of vanilla apex-core (no changes of any kind, just
> > download and "clean install") works fine; however, after updating the
> > Kryo dependency to 4.0.2, I get the non-obvious (to me) error below
> > (running "clean install -X").
> >
> > I also identified a bug, or perhaps a feature: when building on my
> > macOS laptop, I have an IDEA project folder in iCloud that is stored
> > locally in a directory whose name contains a space, which needs to be
> > escaped. When I initially built, I kept running into errors related to
> > that. I am not sure whether that is something that should be fixed (it
> > is not as straightforward as I had hoped) or whether we should simply
> > require that directory names not include any spaces. I have no control
> > over the iCloud local folder name; otherwise, I would have just fixed
> > that.
> >
> > 2018-06-18 12:43:24,485 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 504 ms
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >   at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> >   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> >   at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> >   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> >   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> >   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> >   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> >   at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> >   at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> >   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.junit.runners.Suite.runChild(Suite.java:127)
> >   at org.junit.runners.Suite.runChild(Suite.java:26)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> >   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> >   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> >   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65137 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> >   at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> >   at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> >   at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> >   at java.io.DataInputStream.readInt(DataInputStream.java:387)
> >   at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> >   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:24,987 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65137 Call#141 Retry#0: output error
> > 2018-06-18 12:43:24,999 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 998 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >   at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> >   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> >   at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> >   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> >   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> >   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> >   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> >   at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> >   at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> >   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.junit.runners.Suite.runChild(Suite.java:127)
> >   at org.junit.runners.Suite.runChild(Suite.java:26)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> >   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> >   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> >   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65138 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> >   at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> >   at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> >   at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> >   at java.io.DataInputStream.readInt(DataInputStream.java:387)
> >   at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> >   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:25,607 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 390 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >   at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> >   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> >   at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> >   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> >   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> >   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> >   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> >   at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> >   at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> >   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.junit.runners.Suite.runChild(Suite.java:127)
> >   at org.junit.runners.Suite.runChild(Suite.java:26)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> >   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> >   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> >   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65139 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> >   at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> >   at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> >   at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> >   at java.io.DataInputStream.readInt(DataInputStream.java:387)
> >   at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> >   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:25,987 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65138 Call#142 Retry#0: output error
> > 2018-06-18 12:43:26,603 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 501 ms
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >   at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> >   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> >   at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> >   at com.sun.proxy.$Proxy138.log(Unknown Source)
> >   at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> >   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> >   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> >   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> >   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> >   at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> >   at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> >   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.junit.runners.Suite.runChild(Suite.java:127)
> >   at org.junit.runners.Suite.runChild(Suite.java:26)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> >   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> >   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> >   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65141 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> >   at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> >   at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> >   at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> >   at java.io.DataInputStream.readInt(DataInputStream.java:387)
> >   at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> >   at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:27,105 [IPC Server handler 0 on 65136] WARN  ipc.Server processResponse - IPC Server handler 0 on 65136, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 10.37.129.2:65141 Call#146 Retry#0: output error
> > 2018-06-18 12:43:27,114 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2 to MacBook-Pro-6.local:65136 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142 remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> >   at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> >   at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> >   at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> >   at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> >   at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> >   at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> >   at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> >   at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> >   at java.lang.reflect.Method.invoke(Method.java:498)
> >   at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> >   at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> >   at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> >   at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> >   at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> >   at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> >   at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> >   at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> >   at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.junit.runners.Suite.runChild(Suite.java:127)
> >   at org.junit.runners.Suite.runChild(Suite.java:26)
> >   at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> >   at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> >   at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> >   at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> >   at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> >   at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> >   at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> >   at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> >   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> >   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> >   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/10.37.129.2:65142 remote=MacBook-Pro-6.local/10.37.129.2:65136]
> >   at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> >   at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at java.io.FilterInputStream.read(FilterInputStream.java:133)
> >   at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> >   at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at
> >
> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:27,722 [main] WARN  stram.RecoverableRpcProxy invoke -
> > RPC failure, will retry after 100 ms (remaining 387 ms)
> > java.net.SocketTimeoutException: Call From MacBook-Pro-6.local/10.37.129.2
> > to MacBook-Pro-6.local:65136 failed on socket timeout exception:
> > java.net.SocketTimeoutException: 500 millis timeout while waiting for
> > channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
> > remote=MacBook-Pro-6.local/10.37.129.2:65136]; For more details see:
> > http://wiki.apache.org/hadoop/SocketTimeout
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > at java.lang.reflect.Method.invoke(Method.java:498)
> > at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.junit.runners.Suite.runChild(Suite.java:127)
> > at org.junit.runners.Suite.runChild(Suite.java:26)
> > at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > Caused by: java.net.SocketTimeoutException: 500 millis timeout while
> > waiting for channel to be ready for read. ch :
> > java.nio.channels.SocketChannel[connected local=/10.37.129.2:65143
> > remote=MacBook-Pro-6.local/10.37.129.2:65136]
> > at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > 2018-06-18 12:43:28,109 [IPC Server handler 0 on 65136] WARN  ipc.Server
> > processResponse - IPC Server handler 0 on 65136, call
> > reportError(containerId, null, timeout, null), rpc version=2, client
> > version=201208081755, methodsFingerPrint=-1300451462 from
> > 10.37.129.2:65142 Call#147 Retry#0: output error
> > 2018-06-18 12:43:28,292 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > 2018-06-18 12:43:28,423 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
> > 2018-06-18 12:43:28,491 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > 2018-06-18 12:43:28,492 [main] INFO  stram.StramClient copyInitialState -
> > Copying initial state took 32 ms
> > 2018-06-18 12:43:28,607 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
> > 2018-06-18 12:43:28,671 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > 2018-06-18 12:43:28,673 [main] INFO  stram.StramClient copyInitialState -
> > Copying initial state took 35 ms
> > 2018-06-18 12:43:28,805 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
> > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:28,830 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,046 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,047 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,226 [main] INFO  util.AsyncFSStorageAgent save -
> > using /Users/mbossert/testIdea/apex-core/engine/target/chkp1927717229509930939 as the basepath for checkpointing.
> > 2018-06-18 12:43:29,339 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > 2018-06-18 12:43:29,428 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
> > 2018-06-18 12:43:29,493 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > 2018-06-18 12:43:29,494 [main] INFO  stram.StramClient copyInitialState -
> > Copying initial state took 29 ms
> > 2018-06-18 12:43:29,592 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
> > 2018-06-18 12:43:29,649 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > 2018-06-18 12:43:29,651 [main] INFO  stram.StramClient copyInitialState -
> > Copying initial state took 32 ms
> > 2018-06-18 12:43:29,780 [main] INFO  stram.FSRecoveryHandler rotateLog -
> > Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
> > 2018-06-18 12:43:29,808 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,809 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:29,809 [main] INFO  util.AsyncFSStorageAgent save -
> > using /Users/mbossert/testIdea/apex-core/engine/target/chkp1976097017195725194 as the basepath for checkpointing.
> > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:30,050 [main] WARN  physical.PhysicalPlan <init> -
> > Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
> > 2018-06-18 12:43:30,051 [main] INFO  util.AsyncFSStorageAgent save -
> > using /Users/mbossert/testIdea/apex-core/engine/target/chkp3935270209625805644 as the basepath for checkpointing.
> > Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.329 sec
> > <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
> > testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.097 sec  <<< FAILURE!
> > java.lang.AssertionError: flush count expected:<1> but was:<2>
> > at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
> >
> >
> > --
> >
> > M. Aaron Bossert
> > (571) 242-4021
> > Punch Cyber Analytics Group
> >
> >
> >
>
> --
>
> M. Aaron Bossert
> (571) 242-4021
> Punch Cyber Analytics Group
>
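For anyone following along, the isolation runs suggested above can be scripted. A sketch only: the `engine` module name is inferred from the checkout paths in the quoted log, and the `Class#method` selector assumes a Surefire version that supports method-level selection; adjust to your layout.

```shell
#!/bin/sh
# Sketch: build the command line for re-running one failing test in
# isolation. Printed rather than executed here; pipe to sh (or paste)
# to actually run it from the apex-core checkout root.
isolated_run() {
  # -pl limits the build to the engine module; -am builds its
  # dependencies first so the module compiles in a clean checkout.
  echo "mvn -pl engine -am test -Dtest=$1"
}

isolated_run "StramRecoveryTest#testWriteAheadLog"
isolated_run "StatsTest#testQueueSizeForContainerLocalOperators"
```

If the tests pass in isolation but fail in the full build, that points toward inter-test interference (shared state or leftover files under `target/`) rather than the tests themselves.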
