Hi Guys,

Has anyone faced this issue with Spark?

Why, in Spark Streaming, are executors still shown on the UI even after the
worker has been killed and is no longer in the cluster?

This severely impacts my running jobs, which take much longer to complete,
with stages failing with the exception:

java.io.IOException: Failed to connect to --- (dead worker)

Is this a bug in Spark?

The Spark version is 1.4.0.


Thanks,
Kundan