Hi Spark users,

I'm occasionally seeing the exception below. It causes tasks to fail even
after retries (so I believe it is a non-recoverable exception), which in turn
fails the stage and aborts the job.

Exception ---
java.io.IOException: org.apache.spark.SparkException: Failed to get
broadcast_10_piece0 of broadcast_10

Any idea why this exception occurs and how to avoid or handle it? Please let
me know if you have seen this exception before and know of a fix.
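
For context, here is a minimal sketch of the kind of broadcast usage
involved (my assumption only; the lookup map, app name, and master are made
up, not taken from the actual job). One pattern I have read can surface this
error is a broadcast being destroyed or unpersisted while tasks that
reference it are still running:

Sketch ---
import org.apache.spark.{SparkConf, SparkContext}

object BroadcastSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("broadcast-sketch").setMaster("local[2]"))

    // Hypothetical small lookup table broadcast to the executors.
    val lookup = sc.broadcast(Map(1 -> "a", 2 -> "b"))

    val ids = sc.parallelize(1 to 2)
    println(ids.map(i => lookup.value.getOrElse(i, "?")).collect().mkString(","))

    // Removing the broadcast while later tasks still capture `lookup` is one
    // way executors can end up unable to fetch broadcast_N_piece0.
    lookup.destroy()

    sc.stop()
  }
}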

Thanks,
Bharath
