Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19852#discussion_r154852742

    --- Diff: core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala ---
    @@ -319,7 +319,7 @@ private[spark] abstract class BasePythonRunner[IN, OUT](
           case e: Exception if env.isStopped =>
             logDebug("Exception thrown after context is stopped", e)
    -        null.asInstanceOf[OUT] // exit silently
    +        throw new SparkException("Spark session has been stopped", e)
    --- End diff --

    It seems the Java task doesn't check `env.isStopped` when a task fails, and only checks it when the retry fails. Does the Python task have a retry mechanism?
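    To make the behavioral difference concrete, here is a minimal, self-contained sketch (a hypothetical harness, not Spark's actual runner code; `StoppedEnvExample`, `readNext`, and `envStopped` are made-up names) showing why throwing is preferable to returning `null` once the environment is stopped:

        import org.apache.spark.SparkException

        // Hypothetical stand-in for the catch branch in BasePythonRunner:
        // compares silently returning null with surfacing the failure.
        object StoppedEnvExample {
          def readNext[OUT](envStopped: Boolean)(compute: => OUT): OUT = {
            try {
              compute
            } catch {
              case e: Exception if envStopped =>
                // Old behavior: null.asInstanceOf[OUT] hid the failure, which
                // could resurface later as a confusing NullPointerException.
                // New behavior: fail loudly, keeping the original cause.
                throw new SparkException("Spark session has been stopped", e)
            }
          }

          def main(args: Array[String]): Unit = {
            try {
              readNext[String](envStopped = true) {
                throw new RuntimeException("worker died")
              }
            } catch {
              case e: SparkException => println(s"caught: ${e.getMessage}")
            }
          }
        }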