[
https://issues.apache.org/jira/browse/SPARK-21881?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Kai Londenberg updated SPARK-21881:
---
Description:
This is a duplicate of SPARK-18523, which was not really fixed for me (PySpark
[
https://issues.apache.org/jira/browse/SPARK-21881?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Kai Londenberg updated SPARK-21881:
---
Affects Version/s: (was: 1.6.1)
(was: 2.0.0)
> Again: OOM killer may leave SparkContext in broken state causing Connection Refused errors
[
https://issues.apache.org/jira/browse/SPARK-21881?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Kai Londenberg updated SPARK-21881:
---
Fix Version/s: (was: 2.1.0)
> Again: OOM killer may leave SparkContext in broken state causing Connection Refused errors
[
https://issues.apache.org/jira/browse/SPARK-21881?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Kai Londenberg updated SPARK-21881:
---
Affects Version/s: 2.2.0
> Again: OOM killer may leave SparkContext in broken state causing Connection Refused errors
Kai Londenberg created SPARK-21881:
--
Summary: Again: OOM killer may leave SparkContext in broken state
causing Connection Refused errors
Key: SPARK-21881
URL: https://issues.apache.org/jira/browse/SPARK-21881
[
https://issues.apache.org/jira/browse/SPARK-18523?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16148544#comment-16148544
]
Kai Londenberg edited comment on SPARK-18523 at 8/31/17 7:06 AM:
--
[
https://issues.apache.org/jira/browse/SPARK-18523?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16148544#comment-16148544
]
Kai Londenberg commented on SPARK-18523:
In PySpark 2.2.0 this issue was not really fixed for me.
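For context on the reported failure mode: when the OS OOM killer terminates the JVM backing a PySpark SparkContext, the Python driver still holds a Py4J gateway reference, and later calls fail with "Connection refused". A minimal stdlib sketch (no Spark required; the function name and use of Py4J's default port 25333 are illustrative assumptions, not Spark API) of probing whether anything still listens on the gateway port:

```python
import socket

def gateway_alive(host="127.0.0.1", port=25333, timeout=1.0):
    """Probe whether anything is still listening on the (assumed) gateway port.

    Returns False with the same ConnectionRefusedError the driver would see
    once the OOM killer has taken down the JVM side of the gateway.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError is a subclass of OSError
        return False
```

A check like this only detects the broken state; it does not repair it. Recovery still requires tearing down the stale context and creating a new one, which is the gap the issue describes.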