[ https://issues.apache.org/jira/browse/SPARK-24502?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16574908#comment-16574908 ]
Wenchen Fan commented on SPARK-24502:
-------------------------------------

There is no resource leak. Users need to manage the active and default SparkSession manually, by calling `get/set/clearActiveSession` and `get/set/clearDefaultSession`. This is not very user-friendly, but it is what it is.

Unfortunately, our test framework had a bug: it didn't clear the active/default session when a SparkSession was stopped. This caused problems because we use `SQLConf.get` a lot in the test code, and `SQLConf.get` reads its conf from the active/default session (as the stack trace below shows). My PR fixed the test framework.

It's totally fine to create and close multiple SparkSessions in production code; there is no resource leak there either. But you do need to pay attention if you retrieve the active/default session, since it may still point at a stopped session. The behavior is the same in Spark 2.2.

I'm also adding a safeguard for `SQLConf.get` in https://issues.apache.org/jira/browse/SPARK-25076 , which should ease this problem.
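To make the manual lifecycle management concrete, here is a minimal sketch (an illustration, not code from this ticket; the master/appName settings are arbitrary) of creating, using, and fully tearing down a session, including the manual clearing of the active/default references:

{code}
import org.apache.spark.sql.SparkSession

// Minimal sketch: create a session, use it, then stop it AND clear the
// active/default references so later lookups don't see the stopped session.
val spark = SparkSession.builder()
  .master("local[2]")                   // illustrative settings
  .appName("session-lifecycle-sketch")
  .getOrCreate()                        // also sets the active/default session

try {
  spark.range(10).count()
} finally {
  spark.stop()
  // The manual cleanup described above; without these two calls,
  // getActiveSession/getDefaultSession may still return the stopped session.
  SparkSession.clearActiveSession()
  SparkSession.clearDefaultSession()
}
{code}

Doing this kind of cleanup in the suite teardown is essentially what the test-framework fix amounts to.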
> flaky test: UnsafeRowSerializerSuite
> ------------------------------------
>
>                 Key: SPARK-24502
>                 URL: https://issues.apache.org/jira/browse/SPARK-24502
>             Project: Spark
>          Issue Type: Test
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Wenchen Fan
>            Assignee: Wenchen Fan
>            Priority: Major
>              Labels: flaky-test
>             Fix For: 2.3.2, 2.4.0
>
>
> https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/4193/testReport/org.apache.spark.sql.execution/UnsafeRowSerializerSuite/toUnsafeRow___test_helper_method/
> {code}
> sbt.ForkMain$ForkError: java.lang.IllegalStateException: LiveListenerBus is stopped.
> 	at org.apache.spark.scheduler.LiveListenerBus.addToQueue(LiveListenerBus.scala:97)
> 	at org.apache.spark.scheduler.LiveListenerBus.addToStatusQueue(LiveListenerBus.scala:80)
> 	at org.apache.spark.sql.internal.SharedState.<init>(SharedState.scala:93)
> 	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:120)
> 	at org.apache.spark.sql.SparkSession$$anonfun$sharedState$1.apply(SparkSession.scala:120)
> 	at scala.Option.getOrElse(Option.scala:121)
> 	at org.apache.spark.sql.SparkSession.sharedState$lzycompute(SparkSession.scala:120)
> 	at org.apache.spark.sql.SparkSession.sharedState(SparkSession.scala:119)
> 	at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:286)
> 	at org.apache.spark.sql.test.TestSparkSession.sessionState$lzycompute(TestSQLContext.scala:42)
> 	at org.apache.spark.sql.test.TestSparkSession.sessionState(TestSQLContext.scala:41)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1$$anonfun$apply$1.apply(SparkSession.scala:95)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1$$anonfun$apply$1.apply(SparkSession.scala:95)
> 	at scala.Option.map(Option.scala:146)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1.apply(SparkSession.scala:95)
> 	at org.apache.spark.sql.SparkSession$$anonfun$1.apply(SparkSession.scala:94)
> 	at org.apache.spark.sql.internal.SQLConf$.get(SQLConf.scala:126)
> 	at org.apache.spark.sql.catalyst.expressions.CodeGeneratorWithInterpretedFallback.createObject(CodeGeneratorWithInterpretedFallback.scala:54)
> 	at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:157)
> 	at org.apache.spark.sql.catalyst.expressions.UnsafeProjection$.create(Projection.scala:150)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite.org$apache$spark$sql$execution$UnsafeRowSerializerSuite$$unsafeRowConverter(UnsafeRowSerializerSuite.scala:54)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite.org$apache$spark$sql$execution$UnsafeRowSerializerSuite$$toUnsafeRow(UnsafeRowSerializerSuite.scala:49)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite$$anonfun$2.apply(UnsafeRowSerializerSuite.scala:63)
> 	at org.apache.spark.sql.execution.UnsafeRowSerializerSuite$$anonfun$2.apply(UnsafeRowSerializerSuite.scala:60)
> 	...
> {code}
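For illustration, the trace shows `SQLConf.get` walking from the registered session into a lazily built `SharedState`, which then tries to register a queue on the already-stopped `LiveListenerBus`. A hedged sketch of that failure mode (assumed Spark 2.x behavior, not the actual test code from UnsafeRowSerializerSuite):

{code}
import org.apache.spark.sql.SparkSession

// Hypothetical repro sketch of the failure path in the trace above.
val s1 = SparkSession.builder().master("local[2]").getOrCreate()
s1.stop() // stops the SparkContext, and with it the LiveListenerBus

// If the stopped session is still registered as the active session, any
// operation that lazily initializes its state (as SQLConf.get does
// internally) can fail with "LiveListenerBus is stopped".
SparkSession.getActiveSession.foreach { stale =>
  stale.range(1).count() // may throw the IllegalStateException above
}
{code}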