> 
>> 2.   I notice that once I start ssc.start(), my stream starts processing and
>> continues indefinitely...even if I close the socket on the server end (I'm
>> using unix command "nc" to mimic a server as explained in the streaming
>> programming guide .)  Can I tell my stream to detect if it's lost a
>> connection and therefore stop executing?  (Or even better, to attempt to
>> re-establish the connection?)
>> 
> 
> 
> Currently, not yet. But I am aware of this and this behavior will be
> improved in the future.

Now I understand why our Spark Streaming job starts to generate zero-sized RDDs 
from the Kafka input 
when one worker gets an OOM or crashes.

And we can’t detect it! Great. So Spark Streaming just isn’t suited yet for 
24/7 operation =\
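
Until that detection lands in Spark itself, one driver-side workaround is to count consecutive empty batches inside foreachRDD and stop the context ourselves. Below is a minimal sketch of that idea; the `EmptyBatchMonitor` class, its threshold, and the wiring comment are all hypothetical helpers, not part of the Spark API:

```scala
// Spark Streaming (as of this thread) can't signal a dead source by itself,
// so watch for long runs of empty batches on the driver instead.
// This class and its threshold are hypothetical, not Spark API.
class EmptyBatchMonitor(maxConsecutiveEmpty: Int) {
  private var consecutiveEmpty = 0

  // Record one batch's record count; returns true once we've seen
  // maxConsecutiveEmpty empty batches in a row (i.e. time to stop).
  def recordBatch(count: Long): Boolean = {
    if (count == 0) consecutiveEmpty += 1
    else consecutiveEmpty = 0
    consecutiveEmpty >= maxConsecutiveEmpty
  }
}

// Wiring sketch against the streaming API (not self-contained here):
//   val monitor = new EmptyBatchMonitor(maxConsecutiveEmpty = 10)
//   stream.foreachRDD { rdd =>
//     if (monitor.recordBatch(rdd.count())) ssc.stop()
//   }
```

This doesn't distinguish "source is dead" from "source is legitimately idle", so the threshold has to be tuned to how bursty the Kafka topic is.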
