[ https://issues.apache.org/jira/browse/SPARK-22655?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16271620#comment-16271620 ]
Li Jin commented on SPARK-22655:
--------------------------------

PR: https://github.com/apache/spark/pull/19852

> Fail task instead of complete task silently in PythonRunner during shutdown
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-22655
>                 URL: https://issues.apache.org/jira/browse/SPARK-22655
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.0.2, 2.1.0, 2.2.0
>            Reporter: Li Jin
>
> We have observed in our production environment that during Spark shutdown, active tasks sometimes complete with incorrect results. We have tracked the issue down to PythonRunner, which returns a partial result instead of throwing an exception when the shutdown interrupts it.
> I think the better way to handle this is to have these tasks fail rather than complete with partial results (completing with a partial result is always bad, IMHO).
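For context, here is a minimal sketch of the failure mode and the safe behavior. This is not the actual PythonRunner code; the length-prefixed record framing, the method name, and the object wrapper are simplified assumptions for illustration.

{code:scala}
// Sketch (assumed shape, not the real PythonRunner): turning an early EOF
// from the Python worker into a task failure instead of a silent,
// partial completion.
import java.io.{DataInputStream, EOFException}

import org.apache.spark.SparkException

object ReaderSketch {
  // Reads one length-prefixed record from the worker's output stream.
  // Returns None on the worker's explicit end-of-data marker
  // (modeled here as a negative length).
  def readNextRecord(stream: DataInputStream): Option[Array[Byte]] = {
    try {
      val length = stream.readInt()
      if (length >= 0) {
        val record = new Array[Byte](length)
        stream.readFully(record)
        Some(record)
      } else {
        None // worker signaled normal end of data
      }
    } catch {
      case eof: EOFException =>
        // The stream ended without the end-of-data marker, e.g. the worker
        // was killed by a shutdown hook. Treating this as normal completion
        // is what yields partial results; failing the task is the safe path.
        throw new SparkException(
          "Python worker exited unexpectedly (crashed or killed during shutdown)",
          eof)
    }
  }
}
{code}

The key design point is that "stream ended" and "worker said it is done" are distinguishable events, and only the latter should let the task complete successfully.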