[ https://issues.apache.org/jira/browse/SPARK-34087?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17263266#comment-17263266 ]
Fu Chen edited comment on SPARK-34087 at 1/12/21, 12:02 PM:
------------------------------------------------------------

*bug replay*

Here is code to reproduce this bug:
{code:java}
test("bug replay") {
  (1 to 1000).foreach { i =>
    spark.cloneSession()
    SparkSession.clearActiveSession()
  }
  val cnt = spark.sparkContext
    .listenerBus
    .listeners
    .asScala
    .collect { case e: ExecutionListenerBus => e }
    .size
  println(s"total ExecutionListenerBus count ${cnt}.")
  Thread.sleep(Int.MaxValue)
}
{code}
*output:*
total ExecutionListenerBus count 1001.

*jmap*

!1610451044690.jpg!

Each ExecutionListenerBus holds one SparkSession instance, so the JVM cannot collect these SparkSession objects.

was (Author: fchen):
*bug replay*

Here is code to reproduce this bug:
{code:java}
test("bug replay") {
  (1 to 1000).foreach { i =>
    spark.cloneSession()
  }
  val cnt = spark.sparkContext
    .listenerBus
    .listeners
    .asScala
    .collect { case e: ExecutionListenerBus => e }
    .size
  println(s"total ExecutionListenerBus count ${cnt}.")
  Thread.sleep(Int.MaxValue)
}
{code}
*output:*
total ExecutionListenerBus count 1001.

*jmap*

!1610451044690.jpg!

Each ExecutionListenerBus holds one SparkSession instance, so the JVM cannot collect these SparkSession objects.

> a memory leak occurs when we clone the spark session
> ----------------------------------------------------
>
>                 Key: SPARK-34087
>                 URL: https://issues.apache.org/jira/browse/SPARK-34087
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.0.1
>            Reporter: Fu Chen
>            Priority: Major
>         Attachments: 1610451044690.jpg
>
>
> In Spark 3.0.1, a memory leak occurs when we keep cloning the Spark session,
> because a new ExecutionListenerBus instance is added to the AsyncEventQueue
> each time we clone a session.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
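The reference chain behind the leak can be illustrated with a minimal, self-contained sketch. The classes below (ListenerBus, ExecutionListenerBus, Session, LeakSketch) are simplified stand-ins, not Spark's real implementation: the point is only that the shared bus keeps a strong reference to every listener, and each listener keeps a strong reference to its session, so cloned sessions can never be collected.

{code:java}
import scala.collection.mutable.ListBuffer

object LeakSketch {
  // A shared, long-lived bus, analogous to SparkContext's listenerBus.
  class ListenerBus {
    val listeners = ListBuffer.empty[AnyRef]
    def addListener(l: AnyRef): Unit = listeners += l
  }

  // Each session registers a fresh listener that holds the session itself.
  class ExecutionListenerBus(val session: Session, bus: ListenerBus) {
    bus.addListener(this) // strong reference chain: bus -> listener -> session
  }

  class Session(bus: ListenerBus) {
    new ExecutionListenerBus(this, bus)
    // Cloning registers yet another listener on the SAME shared bus.
    def cloneSession(): Session = new Session(bus)
  }

  def main(args: Array[String]): Unit = {
    val bus  = new ListenerBus
    val root = new Session(bus)
    // The cloned sessions go out of scope immediately...
    (1 to 1000).foreach(_ => root.cloneSession())
    // ...but the bus still strongly references all 1001 listeners
    // (and, through them, all 1001 sessions), so none are collectable.
    println(s"total ExecutionListenerBus count ${bus.listeners.size}.")
  }
}
{code}

Running it prints a count of 1001, mirroring the replay above: one listener for the root session plus one per clone, all pinned by the shared bus.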