[ https://issues.apache.org/jira/browse/SPARK-25380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16623087#comment-16623087 ]
Jungtaek Lim commented on SPARK-25380:
--------------------------------------

I thought of this as an edge case that we might be unsure whether to address in general, but now that two end users have reported the same thing, it no longer looks like an odd case. I'm interested in tackling this issue, but I can't investigate without a reproducer. Could one of you please share a "redacted" query that consistently reproduces the issue?

> Generated plans occupy over 50% of Spark driver memory
> ------------------------------------------------------
>
>                 Key: SPARK-25380
>                 URL: https://issues.apache.org/jira/browse/SPARK-25380
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.1
>        Environment: Spark 2.3.1 (AWS emr-5.16.0)
>            Reporter: Michael Spector
>            Priority: Minor
>        Attachments: Screen Shot 2018-09-06 at 23.19.56.png, Screen Shot 2018-09-12 at 8.20.05.png, heapdump_OOM.png, image-2018-09-16-14-21-38-939.png
>
> While debugging an OOM exception during a long run of a Spark application (many iterations of the same code), I found that generated plans occupy most of the driver memory. I'm not sure whether this is a memory leak or not, but it would be helpful if old plans could be purged from memory anyway.
> Attached are screenshots of the OOM heap dump opened in JVisualVM.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
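A possible mitigation sketch while this is investigated, assuming the retained plans are held by the SQL UI's execution history (this is an assumption about the root cause, not confirmed in the ticket). `spark.sql.ui.retainedExecutions` is a standard Spark configuration that bounds how many SQL executions (including their plan descriptions) the driver retains, and the `-XX:` flags are standard JVM options for capturing a heap dump on OOM, as the reporter did:

```shell
# Sketch: limit retained SQL execution/plan history on the driver
# (default is 1000) and capture a heap dump automatically on OOM.
# /tmp/heapdumps and the app JAR path are placeholders.
spark-submit \
  --conf spark.sql.ui.retainedExecutions=50 \
  --conf spark.driver.extraJavaOptions="-XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/heapdumps" \
  my-application.jar
```

The resulting `.hprof` file can then be opened in JVisualVM (as in the attached screenshots) to check whether plan strings still dominate the heap after lowering the retention limit.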