Hi all,

We have two environments running the same Spark Streaming job, which consumes a Kafka topic and performs calculations.

In one environment, the Spark Streaming job consumed a malformed (non-standard) record from Kafka and threw an exception (we do not catch it in our code), and the streaming job then went down.

In the other environment, the job threw the same exception (identical message in the log file), but the streaming job kept running and continued to consume subsequent data.
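For context, the job in both environments is structured roughly like the sketch below. This is a simplified stand-in rather than our real code: parseRecord, the topic name, and the broker address are all placeholders. The point is that the parse step is not wrapped in any try/catch, so a malformed record makes the task throw:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}
    import org.apache.kafka.common.serialization.StringDeserializer

    object StreamingJobSketch {
      // Placeholder parser: throws NumberFormatException on malformed input.
      def parseRecord(raw: String): Long = raw.trim.toLong

      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(new SparkConf().setAppName("sketch"), Seconds(10))

        val kafkaParams = Map[String, Object](
          "bootstrap.servers"  -> "broker:9092",            // placeholder
          "key.deserializer"   -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id"           -> "sketch-group"
        )

        val stream = KafkaUtils.createDirectStream[String, String](
          ssc,
          LocationStrategies.PreferConsistent,
          ConsumerStrategies.Subscribe[String, String](Seq("our-topic"), kafkaParams))

        stream
          .map(record => parseRecord(record.value()))       // uncaught exception thrown here
          .foreachRDD(rdd => println(s"batch sum: ${rdd.sum()}"))

        ssc.start()
        ssc.awaitTermination()
      }
    }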

Is there a parameter or configuration setting that controls this behavior? Why does one job go down while the other keeps running?
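For what it's worth, we know we could guard the parse ourselves, roughly like the Try-based variant of the map step below (parseRecord is again the placeholder from the sketch above), so that a malformed record is dropped instead of failing the task. But that would not explain why the two environments behave differently as they are today:

    import scala.util.Try

    stream
      .map(_.value())
      // A malformed record becomes None and is dropped rather than
      // throwing, so the batch completes and the job keeps running.
      .flatMap(raw => Try(parseRecord(raw)).toOption)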


