Sorry, a bit off topic, but what replaces
com.datatorrent.api.StatsListener? It is marked as deprecated.
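For context on why that interface matters to the runtime-adjustment discussion below: the StatsListener hook is the mechanism by which the engine periodically hands an operator's runtime stats to user code, which can then request a repartition (scale up/down) on the fly. The sketch below illustrates only the pattern, using stand-in types (OperatorStats, Response, ScalingListener are hypothetical names here, not the real com.datatorrent.api classes, which need the Apex jars to compile):

```java
// Stand-in for the stats the engine reports each heartbeat window.
interface OperatorStats {
    long tuplesPerSecond();
}

// Stand-in for the listener's response; in Apex the real Response
// carries a repartitionRequired flag that triggers dynamic scaling.
class Response {
    boolean repartitionRequired;
}

// Mimics the StatsListener pattern: inspect runtime stats and request
// a repartition when throughput crosses a configured threshold.
class ScalingListener {
    private final long scaleUpThreshold;

    ScalingListener(long scaleUpThreshold) {
        this.scaleUpThreshold = scaleUpThreshold;
    }

    Response processStats(OperatorStats stats) {
        Response r = new Response();
        r.repartitionRequired = stats.tuplesPerSecond() > scaleUpThreshold;
        return r;
    }
}

class ScalingListenerDemo {
    public static void main(String[] args) {
        ScalingListener listener = new ScalingListener(10_000);
        // High throughput: listener asks the engine to repartition.
        System.out.println(listener.processStats(() -> 25_000).repartitionRequired);
        // Low throughput: no repartition requested.
        System.out.println(listener.processStats(() -> 1_000).repartitionRequired);
    }
}
```

This is a sketch of the callback contract only; the real API additionally exposes per-port and latency stats on the batched operator stats object.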

On Wed, Jun 20, 2018 at 2:12 PM Aaron Bossert <aa...@punchcyber.com> wrote:

> I agree.  I am porting a prototype that I built using Storm to Apex.  The
> runtime config changes were the impetus for the switch.  I need to be able
> to adjust settings, compute resources, and topology components (add,
> remove, modify) on the fly...Implementing those features from scratch in
> Storm would have been a massive undertaking.  Currently, there are Apache
> Metron and Spot, which are backed by Hortonworks and Cloudera,
> respectively...this will be a competitor to both of them.  The project will
> be open-sourced once sufficient progress is made.
>
> On Wed, Jun 20, 2018 at 2:02 PM Pramod Immaneni <pramod.imman...@gmail.com>
> wrote:
>
>> That would be awesome, it would be a good use case for the platform.
>>
>> On Wed, Jun 20, 2018 at 10:54 AM Aaron Bossert <aa...@punchcyber.com>
>> wrote:
>>
>> > I'm with you.  I'm hoping to have it underpin a fairly large network
>> > security pipeline...
>> >
>> > On Wed, Jun 20, 2018 at 1:29 PM Vlad Rozov <vro...@apache.org> wrote:
>> >
>> > > +1. Apex does not have as large a community as other Apache projects,
>> > > but let's try to build it.
>> > >
>> > > Thank you,
>> > >
>> > > Vlad
>> > >
>> > > On 6/20/18 10:24, Pramod Immaneni wrote:
>> > > > Aaron,
>> > > >
>> > > > Your concerns are legitimate; the number of folks contributing in
>> > > > terms of code commits has dropped quite a bit. I do, however, still
>> > > > see interest from ex-contributors and committers based on their
>> > > > participation in discussions on the dev list when important topics
>> > > > come up. It remains to be seen how the recent changes at DataTorrent
>> > > > will play out and whether ex-DataTorrent folks will contribute in the
>> > > > upcoming months. There was a new initiative started by an
>> > > > ex-DataTorrent employee a few days back on adding a basic UI to Apex,
>> > > > so maybe there is hope.
>> > > >
>> > > > Having said that, a few of us long-timers have stuck it out through
>> > > > thick and thin and are willing to help whenever possible.
>> > > >
>> > > > Thanks
>> > > >
>> > > > On Wed, Jun 20, 2018 at 9:19 AM Aaron Bossert <aa...@punchcyber.com>
>> > > > wrote:
>> > > >
>> > > >> Gentlemen,
>> > > >>
>> > > >> I am working at the start of a fairly large project.  I have some
>> > > >> questions related to the general health of Apex...I need to get a
>> > > >> warm and fuzzy feeling that the project is not going to die on the
>> > > >> vine, as it were.
>> > > >>
>> > > >> I am seeing that the volume of commits and contributor activity has
>> > > >> dropped off significantly since early 2016, and that there has been
>> > > >> another drop since DataTorrent folded...What is your sense of the
>> > > >> project?  I really like the framework and definitely would prefer to
>> > > >> use it as well as contribute back...I just want to make sure I am not
>> > > >> going to have to work on it solo or, worse, end up having to switch
>> > > >> to something else later...
>> > > >>
>> > > >> Your thoughts?
>> > > >>
>> > > >> Aaron
>> > > >>
>> > > >> On Wed, Jun 20, 2018 at 12:10 PM Pramod Immaneni <
>> > > >> pramod.imman...@gmail.com>
>> > > >> wrote:
>> > > >>
>> > > >>> The Hadoop IPC calls are failing, possibly because of the reliance
>> > > >>> on Kryo for serializing the payload and some incompatibility with
>> > > >>> the new version. I will dig in more to see what is going on.
>> > > >>>
>> > > >>> On Tue, Jun 19, 2018 at 6:54 PM Aaron Bossert <aa...@punchcyber.com>
>> > > >>> wrote:
>> > > >>>
>> > > >>>> Pramod,
>> > > >>>>
>> > > >>>> Thanks for taking the time to help!
>> > > >>>>
>> > > >>>> Here is the output (just the failed parts) when running a full
>> > > >>>> install (clean install -X) on the master branch:
>> > > >>>>
>> > > >>>> Running com.datatorrent.stram.StramRecoveryTest
>> > > >>>> 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest
>> > > >>>> testRpcFailover - Mock server listening at macbook-pro-6.lan/
>> > > >>>> 192.168.87.125:62154
>> > > >>>> 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy
>> > invoke
>> > > -
>> > > >>>> Giving up RPC connection recovery after 507 ms
>> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> > > >>>> 192.168.87.125
>> > > >>>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> > > >>>> java.net.SocketTimeoutException: 500 millis timeout while
>> waiting
>> > for
>> > > >>>> channel to be ready for read. ch :
>> > > >>>> java.nio.channels.SocketChannel[connected local=/
>> > 192.168.87.125:62155
>> > > >>>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details
>> > see:
>> > > >>>> http://wiki.apache.org/hadoop/SocketTimeout
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> > > >> Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > > >>>> at
>> java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> > > >>>> at org.apache.hadoop.net
>> > .NetUtils.wrapWithMessage(NetUtils.java:791)
>> > > >>>> at org.apache.hadoop.net
>> .NetUtils.wrapException(NetUtils.java:750)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at
>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at
>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at
>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at
>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at
>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at
>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at
>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at
>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at
>> > > org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> > > >>>> at
>> > > >>>>
>> > > >>
>> > >
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout
>> > while
>> > > >>>> waiting for channel to be ready for read. ch :
>> > > >>>> java.nio.channels.SocketChannel[connected local=/
>> > 192.168.87.125:62155
>> > > >>>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> > > >>>> at
>> > > >>>> org.apache.hadoop.net
>> > > >>>> .SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> > > >>>> at org.apache.hadoop.net
>> > > >>>> .SocketInputStream.read(SocketInputStream.java:161)
>> > > >>>> at org.apache.hadoop.net
>> > > >>>> .SocketInputStream.read(SocketInputStream.java:131)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> > > >>>> 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN
>> > > >> ipc.Server
>> > > >>>> processResponse - IPC Server handler 0 on 62154, call
>> > log(containerId,
>> > > >>>> timeout), rpc version=2, client version=201208081755,
>> > > >>>> methodsFingerPrint=-1300451462 from 192.168.87.125:62155
>> Call#136
>> > > >>> Retry#0:
>> > > >>>> output error
>> > > >>>> 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy
>> > invoke
>> > > -
>> > > >>> RPC
>> > > >>>> failure, will retry after 100 ms (remaining 994 ms)
>> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> > > >>>> 192.168.87.125
>> > > >>>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> > > >>>> java.net.SocketTimeoutException: 500 millis timeout while
>> waiting
>> > for
>> > > >>>> channel to be ready for read. ch :
>> > > >>>> java.nio.channels.SocketChannel[connected local=/
>> > 192.168.87.125:62156
>> > > >>>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details
>> > see:
>> > > >>>> http://wiki.apache.org/hadoop/SocketTimeout
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> > > >> Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > > >>>> at
>> java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> > > >>>> at org.apache.hadoop.net
>> > .NetUtils.wrapWithMessage(NetUtils.java:791)
>> > > >>>> at org.apache.hadoop.net
>> .NetUtils.wrapException(NetUtils.java:750)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at
>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at
>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at
>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at
>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at
>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at
>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at
>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at
>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at
>> > > org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> > > >>>> at
>> > > >>>>
>> > > >>
>> > >
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout
>> > while
>> > > >>>> waiting for channel to be ready for read. ch :
>> > > >>>> java.nio.channels.SocketChannel[connected local=/
>> > 192.168.87.125:62156
>> > > >>>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> > > >>>> at
>> > > >>>> org.apache.hadoop.net
>> > > >>>> .SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> > > >>>> at org.apache.hadoop.net
>> > > >>>> .SocketInputStream.read(SocketInputStream.java:161)
>> > > >>>> at org.apache.hadoop.net
>> > > >>>> .SocketInputStream.read(SocketInputStream.java:131)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> > > >>>> 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy
>> > invoke
>> > > -
>> > > >>> RPC
>> > > >>>> failure, will retry after 100 ms (remaining 386 ms)
>> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> > > >>>> 192.168.87.125
>> > > >>>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> > > >>>> java.net.SocketTimeoutException: 500 millis timeout while
>> waiting
>> > for
>> > > >>>> channel to be ready for read. ch :
>> > > >>>> java.nio.channels.SocketChannel[connected local=/
>> > 192.168.87.125:62157
>> > > >>>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details
>> > see:
>> > > >>>> http://wiki.apache.org/hadoop/SocketTimeout
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> > > >> Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > > >>>> at
>> java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> > > >>>> at org.apache.hadoop.net
>> > .NetUtils.wrapWithMessage(NetUtils.java:791)
>> > > >>>> at org.apache.hadoop.net
>> .NetUtils.wrapException(NetUtils.java:750)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at
>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at
>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at
>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at
>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at
>> org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at
>> org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at
>> org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at
>> org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at
>> > > org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> > > >>>> at
>> > > >>>>
>> > > >>
>> > >
>> org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout
>> > while
>> > > >>>> waiting for channel to be ready for read. ch :
>> > > >>>> java.nio.channels.SocketChannel[connected local=/
>> > 192.168.87.125:62157
>> > > >>>> remote=macbook-pro-6.lan/192.168.87.125:62154]
>> > > >>>> at
>> > > >>>> org.apache.hadoop.net
>> > > >>>> .SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> > > >>>> at org.apache.hadoop.net
>> > > >>>> .SocketInputStream.read(SocketInputStream.java:161)
>> > > >>>> at org.apache.hadoop.net
>> > > >>>> .SocketInputStream.read(SocketInputStream.java:131)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> > > >>>> 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN
>> > > >> ipc.Server
>> > > >>>> processResponse - IPC Server handler 0 on 62154, call
>> > log(containerId,
>> > > >>>> timeout), rpc version=2, client version=201208081755,
>> > > >>>> methodsFingerPrint=-1300451462 from 192.168.87.125:62156
>> Call#137
>> > > >>> Retry#0:
>> > > >>>> output error
>> > > >>>> 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy
>> > invoke
>> > > -
>> > > >>>> Giving up RPC connection recovery after 506 ms
>> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/
>> > > >>>> 192.168.87.125
>> > > >>>> to macbook-pro-6.lan:62154 failed on socket timeout exception:
>> > > >>>> java.net.SocketTimeoutException: 500 millis timeout while
>> waiting
>> > for
>> > > >>>> channel to be ready for read. ch :
>> > > >>>> java.nio.channels.SocketChannel[connected local=/
>> > 192.168.87.125:62159
>> > > >>>> remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details
>> > see:
>> > > >>>> http://wiki.apache.org/hadoop/SocketTimeout
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
>> > > >> Method)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > > >>>> at
>> > > >>>>
>> > > >>>>
>> > > >>
>> > >
>> >
>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > > >>>> at
>> java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> > > >>>> at org.apache.hadoop.net
>> > .NetUtils.wrapWithMessage(NetUtils.java:791)
>> > > >>>> at org.apache.hadoop.net
>> .NetUtils.wrapException(NetUtils.java:750)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
>> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> > > >>>> 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0: output error
>> > > >>>> 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
>> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> > > >>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> > > >>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> > > >>>> 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 391 ms)
>> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
>> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
>> > > >>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
>> > > >>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
>> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
>> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
>> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
>> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
>> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
>> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
>> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
>> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
>> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
>> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
>> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
>> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
>> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
>> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
>> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
>> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
>> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
>> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
>> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
>> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
>> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]
>> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
>> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
>> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
>> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
>> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
>> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
>> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> > > >>>> 2018-06-19 21:34:32,310 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call reportError(containerId, null, timeout, null), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62160 Call#142 Retry#0: output error
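[Editor's note] The retry behavior visible in the log above (RecoverableRpcProxy retrying after a fixed 100 ms delay while the remaining budget counts down, 995 ms then 391 ms, before the failure is surfaced) follows a common fixed-delay-with-deadline pattern. The sketch below is a hypothetical illustration of that pattern, not the actual RecoverableRpcProxy implementation; the class and method names are invented for the example.

```java
import java.util.concurrent.Callable;

public class RetryWithDeadline {
    // Retry a call on SocketTimeoutException with a fixed delay between
    // attempts, giving up once the overall deadline budget is exhausted.
    public static <T> T invokeWithRetry(Callable<T> call, long retryDelayMs, long budgetMs)
            throws Exception {
        long deadline = System.currentTimeMillis() + budgetMs;
        while (true) {
            try {
                return call.call();
            } catch (java.net.SocketTimeoutException e) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    throw e; // budget exhausted, surface the failure
                }
                System.out.println("RPC failure, will retry after " + retryDelayMs
                        + " ms (remaining " + remaining + " ms)");
                Thread.sleep(Math.min(retryDelayMs, remaining));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        final int[] attempts = {0};
        // Simulated RPC that times out twice, then succeeds on the third try.
        String result = invokeWithRetry(() -> {
            if (++attempts[0] < 3) {
                throw new java.net.SocketTimeoutException("simulated 500 millis timeout");
            }
            return "ok";
        }, 10, 1000);
        System.out.println(result + " after " + attempts[0] + " attempts");
    }
}
```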
>> > > >>>> 2018-06-19 21:34:32,512 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>> > > >>>> 2018-06-19 21:34:32,628 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app1/recovery/log
>> > > >>>> 2018-06-19 21:34:32,696 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>> > > >>>> 2018-06-19 21:34:32,698 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 32 ms
>> > > >>>> 2018-06-19 21:34:32,799 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app2/recovery/log
>> > > >>>> 2018-06-19 21:34:32,850 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>> > > >>>> 2018-06-19 21:34:32,851 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 28 ms
>> > > >>>> 2018-06-19 21:34:32,955 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithSyncAgent/app3/recovery/log
>> > > >>>> 2018-06-19 21:34:32,976 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:32,977 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:33,166 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:33,338 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp2603930902590449397 as the basepath for checkpointing.
>> > > >>>> 2018-06-19 21:34:33,436 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>> > > >>>> 2018-06-19 21:34:33,505 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app1/recovery/log
>> > > >>>> 2018-06-19 21:34:33,553 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>> > > >>>> 2018-06-19 21:34:33,554 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 22 ms
>> > > >>>> 2018-06-19 21:34:33,642 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app2/recovery/log
>> > > >>>> 2018-06-19 21:34:33,690 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>> > > >>>> 2018-06-19 21:34:33,691 [main] INFO  stram.StramClient copyInitialState - Copying initial state took 29 ms
>> > > >>>> 2018-06-19 21:34:33,805 [main] INFO  stram.FSRecoveryHandler rotateLog - Creating target/com.datatorrent.stram.StramRecoveryTest/testRestartAppWithAsyncAgent/app3/recovery/log
>> > > >>>> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:33,830 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:33,831 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:33,831 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1878353095301008843 as the basepath for checkpointing.
>> > > >>>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=3,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=4,name=o2,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:34,077 [main] WARN  physical.PhysicalPlan <init> - Operator PTOperator[id=5,name=o3,state=INACTIVE] shares container without locality contraint due to insufficient resources.
>> > > >>>> 2018-06-19 21:34:34,077 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp7337975615972280003 as the basepath for checkpointing.
>> > > >>>> Tests run: 8, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 6.143 sec <<< FAILURE! - in com.datatorrent.stram.StramRecoveryTest
>> > > >>>> testWriteAheadLog(com.datatorrent.stram.StramRecoveryTest)  Time elapsed: 0.111 sec  <<< FAILURE!
>> > > >>>> java.lang.AssertionError: flush count expected:<1> but was:<2>
>> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testWriteAheadLog(StramRecoveryTest.java:326)
>> > > >>>>
>> > > >>>> Running com.datatorrent.stram.CustomControlTupleTest
>> > > >>>> 2018-06-19 21:34:49,308 [main] INFO  util.AsyncFSStorageAgent save - using /Users/mbossert/testIdea/apex-core/engine/target/chkp1213673348429546877 as the basepath for checkpointing.
>> > > >>>> 2018-06-19 21:34:49,451 [main] INFO  storage.DiskStorage <init> - using /Users/mbossert/testIdea/apex-core/engine/target as the basepath for spooling.
>> > > >>>> 2018-06-19 21:34:49,451 [ProcessWideEventLoop] INFO  server.Server registered - Server started listening at /0:0:0:0:0:0:0:0:62181
>> > > >>>> 2018-06-19 21:34:49,451 [main] INFO  stram.StramLocalCluster run - Buffer server started: localhost:62181
>> > > >>>> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster run - Started container container-0
>> > > >>>> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster run - Started container container-1
>> > > >>>> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster run - Started container container-2
>> > > >>>> 2018-06-19 21:34:49,452 [container-1] INFO  stram.StramLocalCluster log - container-1 msg: [container-1] Entering heartbeat loop..
>> > > >>>> 2018-06-19 21:34:49,452 [container-0] INFO  stram.StramLocalCluster log - container-0 msg: [container-0] Entering heartbeat loop..
>> > > >>>> 2018-06-19 21:34:49,452 [container-2] INFO  stram.StramLocalCluster log - container-2 msg: [container-2] Entering heartbeat loop..
>> > > >>>> 2018-06-19 21:34:50,460 [container-2] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=3,name=receiver,type=GENERIC,checkpoint={ffffffffffffffff, 0, 0},inputs=[OperatorDeployInfo.InputDeployInfo[portName=input,streamId=ProcessorToReceiver,sourceNodeId=2,sourcePortName=output,locality=<null>,partitionMask=0,partitionKeys=<null>]],outputs=[]]]
>> > > >>>> 2018-06-19 21:34:50,460 [container-0] INFO  engine.StreamingContainer processHeartbeatResponse - Deploy request: [OperatorDeployInfo[id=1,name=randomGenerator,type=INPUT,checkpoint={ffffffffffffffff, 0, 0},inputs=[],outputs=[OperatorDeployInfo.OutputDeployInfo[portName=out,streamId=genToProcessor,bufferS
>
> --
>
> M. Aaron Bossert
> (571) 242-4021
> Punch Cyber Analytics Group
>

-- 

M. Aaron Bossert
(571) 242-4021
Punch Cyber Analytics Group
