cloud-fan commented on code in PR #53055:
URL: https://github.com/apache/spark/pull/53055#discussion_r2532558451
##########
core/src/main/scala/org/apache/spark/internal/config/Python.scala:
##########
@@ -138,4 +138,15 @@ private[spark] object Python {
.intConf
.checkValue(_ > 0, "If set, the idle worker max size must be > 0.")
.createOptional
+
+ val PYTHON_DAEMON_KILL_WORKER_ON_FLUSH_FAILURE =
+ ConfigBuilder("spark.python.daemon.killWorkerOnFlushFailure")
+      .doc("When enabled, exceptions raised during output flush operations in the Python " +
+        "worker managed under Python daemon are not caught, causing the worker to terminate " +
+        "with the exception. This allows Spark to detect the failure and retry the task. " +
Review Comment:
launch a new worker and retry the task?
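
   To illustrate the behavioral difference this flag controls, here is a minimal Python sketch. It is not the actual PySpark daemon code; the function `flush_output` and the parameter `kill_on_flush_failure` are hypothetical names standing in for the worker's flush path and the new config.

   ```python
   # Illustrative sketch only (not the real PySpark daemon implementation):
   # how a killWorkerOnFlushFailure-style flag could change flush error handling.

   def flush_output(stream, kill_on_flush_failure):
       """Flush a worker output stream, honoring the hypothetical flag."""
       try:
           stream.flush()
       except OSError:
           if kill_on_flush_failure:
               # Re-raise so the worker process terminates with the exception;
               # Spark can then detect the dead worker, launch a new one,
               # and retry the task.
               raise
           # Previous behavior: swallow the error and keep the worker alive,
           # which can leave it in a broken state.
           return False
       return True
   ```

   With the flag enabled, a flush failure escapes and kills the worker rather than being silently ignored.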
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]