Thanks!

On Wed, Jun 20, 2018 at 2:28 PM Pramod Immaneni <pramod.imman...@gmail.com>
wrote:

> I believe it is StatsListenerWithContext
>
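> A minimal sketch of the listener contract in question (the hypothetical
> BacklogListener below uses the classic StatsListener signature; as I
> understand it, StatsListenerWithContext adds a context argument to
> processStats, so check the current javadoc for its exact shape):
>
>     import com.datatorrent.api.StatsListener;
>
>     // Minimal illustration: request a repartition when an operator
>     // stops processing tuples.
>     public class BacklogListener implements StatsListener
>     {
>       @Override
>       public Response processStats(BatchedOperatorStats stats)
>       {
>         Response response = new Response();
>         // repartitionRequired asks the platform to run the partitioner again.
>         response.repartitionRequired = stats.getTuplesProcessedPSMA() == 0;
>         return response;
>       }
>     }
>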
> On Wed, Jun 20, 2018 at 11:25 AM Aaron Bossert <aa...@punchcyber.com>
> wrote:
>
> > Sorry, a bit off topic, but what replaces
> > com.datatorrent.api.StatsListener? It is marked as deprecated.
> >
> > On Wed, Jun 20, 2018 at 2:12 PM Aaron Bossert <aa...@punchcyber.com>
> > wrote:
> >
> > > I agree.  I am porting a prototype that I built using Storm to Apex.
> > > The runtime config changes were the impetus for the switch.  I need to
> > > be able to adjust settings, compute resources, and topology components
> > > (add, remove, modify) on the fly...Implementing those features from
> > > scratch in Storm would have been a massive undertaking.  Currently,
> > > there are Apache Metron and Spot, which are backed by Hortonworks and
> > > Cloudera, respectively...this will be a competitor to both of them.
> > > The project will be open-sourced once sufficient progress is made.
> > >
> > > On Wed, Jun 20, 2018 at 2:02 PM Pramod Immaneni <pramod.imman...@gmail.com>
> > > wrote:
> > >
> > >> That would be awesome; it would be a good use case for the platform.
> > >>
> > >> On Wed, Jun 20, 2018 at 10:54 AM Aaron Bossert <aa...@punchcyber.com>
> > >> wrote:
> > >>
> > >> > I'm with you.  I'm hoping to have it underpin a fairly large network
> > >> > security pipeline...
> > >> >
> > >> > On Wed, Jun 20, 2018 at 1:29 PM Vlad Rozov <vro...@apache.org> wrote:
> > >> >
> > >> > > +1. Apex does not have as large a community as other Apache
> > >> > > projects, but let's try to build it.
> > >> > >
> > >> > > Thank you,
> > >> > >
> > >> > > Vlad
> > >> > >
> > >> > > On 6/20/18 10:24, Pramod Immaneni wrote:
> > >> > > > Aaron,
> > >> > > >
> > >> > > > Your concerns are legitimate; the number of folks contributing in
> > >> > > > terms of code commits has dropped quite a bit. However, I do still
> > >> > > > see interest from ex-contributors and committers, based on their
> > >> > > > participation in discussions on the dev list when important topics
> > >> > > > come up. It remains to be seen how the recent changes with
> > >> > > > DataTorrent will play out and whether ex-DataTorrent folks will
> > >> > > > contribute in the upcoming months. There was a new initiative
> > >> > > > started by an ex-DataTorrent employee a few days back to add a
> > >> > > > basic UI to Apex, so maybe there is hope.
> > >> > > >
> > >> > > > Having said that, a few of us long-timers have stuck it out through
> > >> > > > thick and thin and are willing to help whenever possible.
> > >> > > >
> > >> > > > Thanks
> > >> > > >
> > >> > > > On Wed, Jun 20, 2018 at 9:19 AM Aaron Bossert <aa...@punchcyber.com>
> > >> > > > wrote:
> > >> > > >
> > >> > > >> Gentlemen,
> > >> > > >>
> > >> > > >> I am working at the start of a fairly large project.  I have some
> > >> > > >> questions related to the general health of Apex...need to get a
> > >> > > >> warm and fuzzy feeling that the project is not going to die on the
> > >> > > >> vine, as it were.
> > >> > > >>
> > >> > > >> I am seeing that the volume of commits and contributor activity
> > >> > > >> has dropped off significantly since early 2016, and there has been
> > >> > > >> another drop after DataTorrent folded...What is your sense of the
> > >> > > >> project?  I really like the framework and would definitely prefer
> > >> > > >> to use it as well as contribute back...just want to make sure I am
> > >> > > >> not going to have to work on it solo or, worse, end up having to
> > >> > > >> switch to something else later...
> > >> > > >>
> > >> > > >> Your thoughts?
> > >> > > >>
> > >> > > >> Aaron
> > >> > > >>
> > >> > > >> On Wed, Jun 20, 2018 at 12:10 PM Pramod Immaneni <
> > >> > > >> pramod.imman...@gmail.com>
> > >> > > >> wrote:
> > >> > > >>
> > >> > > >>> There are Hadoop IPC calls failing, possibly because of the
> > >> > > >>> reliance on kryo for serializing the payload and some
> > >> > > >>> incompatibility with the new version. I will dig in more to see
> > >> > > >>> what is going on.
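> > >> > > >>>
> > >> > > >>> If it helps while digging in, one way to rerun just that suite
> > >> > > >>> with full debug output (assuming the stram tests live in
> > >> > > >>> apex-core's engine module) is:
> > >> > > >>>
> > >> > > >>>     mvn test -pl engine -Dtest=StramRecoveryTest -X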
> > >> > > >>>
> > >> > > >>> On Tue, Jun 19, 2018 at 6:54 PM Aaron Bossert <aa...@punchcyber.com>
> > >> > > >>> wrote:
> > >> > > >>>
> > >> > > >>>> Pramod,
> > >> > > >>>>
> > >> > > >>>> Thanks for taking the time to help!
> > >> > > >>>>
> > >> > > >>>> Here is the output (just failed parts) when running full
> > >> > > >>>> install (clean install -X) on the Master branch:
> > >> > > >>>>
> > >> > > >>>> Running com.datatorrent.stram.StramRecoveryTest
> > >> > > >>>> 2018-06-19 21:34:28,137 [main] INFO  stram.StramRecoveryTest testRpcFailover - Mock server listening at macbook-pro-6.lan/192.168.87.125:62154
> > >> > > >>>> 2018-06-19 21:34:28,678 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 507 ms
> > >> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:  http://wiki.apache.org/hadoop/SocketTimeout
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > >> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > >> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:561)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > >> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > >> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > >> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > >> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > >> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > >> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > >> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62155 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > >> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > >> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > >> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > >> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > >> > > >>>> 2018-06-19 21:34:29,178 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62155 Call#136 Retry#0: output error
> > >> > > >>>> 2018-06-19 21:34:29,198 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 994 ms)
> > >> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:  http://wiki.apache.org/hadoop/SocketTimeout
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > >> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > >> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > >> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > >> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > >> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > >> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > >> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > >> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > >> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62156 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > >> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > >> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > >> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > >> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > >> > > >>>> 2018-06-19 21:34:29,806 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 386 ms)
> > >> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:  http://wiki.apache.org/hadoop/SocketTimeout
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > >> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > >> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:575)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > >> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > >> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > >> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > >> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > >> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > >> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > >> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62157 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > >> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > >> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > >> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > >> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > >> > > >>>> 2018-06-19 21:34:30,180 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62156 Call#137 Retry#0: output error
> > >> > > >>>> 2018-06-19 21:34:30,808 [main] ERROR stram.RecoverableRpcProxy invoke - Giving up RPC connection recovery after 506 ms
> > >> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:  http://wiki.apache.org/hadoop/SocketTimeout
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > >> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > >> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > >> > > >>>> at com.sun.proxy.$Proxy138.log(Unknown Source)
> > >> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:596)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > >> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > >> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > >> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > >> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > >> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > >> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > >> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62159 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > >> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > >> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > >> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > >> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > >> > > >>>> 2018-06-19 21:34:31,307 [IPC Server handler 0 on 62154] WARN  ipc.Server processResponse - IPC Server handler 0 on 62154, call log(containerId, timeout), rpc version=2, client version=201208081755, methodsFingerPrint=-1300451462 from 192.168.87.125:62159 Call#141 Retry#0: output error
> > >> > > >>>> 2018-06-19 21:34:31,327 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 995 ms)
> > >> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:  http://wiki.apache.org/hadoop/SocketTimeout
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > >> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > >> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > >> > > >>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at com.datatorrent.stram.RecoverableRpcProxy.invoke(RecoverableRpcProxy.java:157)
> > >> > > >>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > >> > > >>>> at com.datatorrent.stram.StramRecoveryTest.testRpcFailover(StramRecoveryTest.java:610)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
> > >> > > >>>> at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
> > >> > > >>>> at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
> > >> > > >>>> at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
> > >> > > >>>> at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
> > >> > > >>>> at org.junit.rules.TestWatcher$1.evaluate(TestWatcher.java:55)
> > >> > > >>>> at org.junit.rules.RunRules.evaluate(RunRules.java:20)
> > >> > > >>>> at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
> > >> > > >>>> at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:127)
> > >> > > >>>> at org.junit.runners.Suite.runChild(Suite.java:26)
> > >> > > >>>> at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
> > >> > > >>>> at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
> > >> > > >>>> at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
> > >> > > >>>> at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
> > >> > > >>>> at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
> > >> > > >>>> at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCore.run(JUnitCore.java:55)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:137)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeEager(JUnitCoreWrapper.java:107)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:83)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:75)
> > >> > > >>>> at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:161)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
> > >> > > >>>> at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
> > >> > > >>>> Caused by: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62160 remote=macbook-pro-6.lan/192.168.87.125:62154]
> > >> > > >>>> at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
> > >> > > >>>> at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:131)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at java.io.FilterInputStream.read(FilterInputStream.java:133)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:513)
> > >> > > >>>> at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
> > >> > > >>>> at java.io.BufferedInputStream.read(BufferedInputStream.java:265)
> > >> > > >>>> at java.io.DataInputStream.readInt(DataInputStream.java:387)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1071)
> > >> > > >>>> at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
> > >> > > >>>> 2018-06-19 21:34:31,931 [main] WARN  stram.RecoverableRpcProxy invoke - RPC failure, will retry after 100 ms (remaining 391 ms)
> > >> > > >>>> java.net.SocketTimeoutException: Call From macbook-pro-6.lan/192.168.87.125 to macbook-pro-6.lan:62154 failed on socket timeout exception: java.net.SocketTimeoutException: 500 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/192.168.87.125:62161 remote=macbook-pro-6.lan/192.168.87.125:62154]; For more details see:  http://wiki.apache.org/hadoop/SocketTimeout
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > >> > > >>>> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> > >> > > >>>> at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:791)
> > >> > > >>>> at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:750)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> > >> > > >>>> at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> > >> > > >>>> at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:244)
> > >> > > >>>> at com.sun.proxy.$Proxy138.reportError(Unknown Source)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >> > > >>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> > >> > > >>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> > >> > > >>>> at java.lang.reflect.Method.invoke(Method.java:498)
> > >> > > >>>> at



-- 

M. Aaron Bossert
(571) 242-4021
Punch Cyber Analytics Group
