[ https://issues.apache.org/jira/browse/SPARK-3888?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14173454#comment-14173454 ]
Apache Spark commented on SPARK-3888:
-------------------------------------

User 'davies' has created a pull request for this issue: https://github.com/apache/spark/pull/2743

> Limit the memory used by python worker
> --------------------------------------
>
>                 Key: SPARK-3888
>                 URL: https://issues.apache.org/jira/browse/SPARK-3888
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: Davies Liu
>            Assignee: Davies Liu
>
> Right now we do not limit the memory used by Python workers, so they may run
> out of memory and freeze the OS. It would be safer to have a configurable hard
> limit, which should be larger than spark.executor.python.memory.
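As a rough illustration of the idea (not the actual implementation in the linked pull request), a hard per-worker cap could be enforced on POSIX systems with the standard-library resource module, so that a runaway worker gets a MemoryError instead of exhausting the machine. The helper name and the 1 GiB figure below are illustrative, not from the issue:

```python
import resource

def set_memory_limit(limit_bytes):
    """Impose a hard cap on this process's address space (POSIX only).

    Illustrative sketch: once the soft limit is reached, further
    allocations in the worker raise MemoryError rather than driving
    the OS into swap.
    """
    soft, hard = resource.getrlimit(resource.RLIMIT_AS)
    # The new soft limit must not exceed the existing hard limit.
    if hard != resource.RLIM_INFINITY:
        limit_bytes = min(limit_bytes, hard)
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, hard))

# Example: cap the worker at 1 GiB of address space.
set_memory_limit(1 << 30)
```

Such a limit would typically be applied in the worker process right after it is forked, with the byte count read from a Spark configuration property.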