[ https://issues.apache.org/jira/browse/SPARK-29007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-29007.
------------------------------------
    Fix Version/s: 3.0.0
         Assignee: Jungtaek Lim
       Resolution: Fixed

> Possible leak of SparkContext in tests / test suites initializing StreamingContext
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-29007
>                 URL: https://issues.apache.org/jira/browse/SPARK-29007
>             Project: Spark
>          Issue Type: Bug
>          Components: DStreams, MLlib, Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Jungtaek Lim
>            Assignee: Jungtaek Lim
>            Priority: Minor
>             Fix For: 3.0.0
>
>
> Many tests create a StreamingContext whose constructor creates a new 
> SparkContext, and we don't have a sufficient guard against leaking the 
> SparkContext across test suites. Ideally we should ensure the SparkContext 
> is not leaked between test suites, and even between tests when each test 
> creates its own StreamingContext.
>  
> One example of the leakage is below:
> {noformat}
> [info] *** 4 SUITES ABORTED ***
> [info] *** 131 TESTS FAILED ***
> [error] Error: Total 418, Failed 131, Errors 4, Passed 283, Ignored 1
> [error] Failed tests:
> [error]       org.apache.spark.streaming.scheduler.JobGeneratorSuite
> [error]       org.apache.spark.streaming.ReceiverInputDStreamSuite
> [error]       org.apache.spark.streaming.WindowOperationsSuite
> [error]       org.apache.spark.streaming.StreamingContextSuite
> [error]       org.apache.spark.streaming.scheduler.ReceiverTrackerSuite
> [error]       org.apache.spark.streaming.CheckpointSuite
> [error]       org.apache.spark.streaming.UISeleniumSuite
> [error]       
> org.apache.spark.streaming.scheduler.ExecutorAllocationManagerSuite
> [error]       org.apache.spark.streaming.ReceiverSuite
> [error]       org.apache.spark.streaming.BasicOperationsSuite
> [error]       org.apache.spark.streaming.InputStreamsSuite
> [error] Error during tests:
> [error]       org.apache.spark.streaming.MapWithStateSuite
> [error]       org.apache.spark.streaming.DStreamScopeSuite
> [error]       org.apache.spark.streaming.rdd.MapWithStateRDDSuite
> [error]       org.apache.spark.streaming.scheduler.InputInfoTrackerSuite
>  {noformat}
> {noformat}
> [info] JobGeneratorSuite:
> [info] - SPARK-6222: Do not clear received block data too soon *** FAILED *** 
> (2 milliseconds)
> [info]   org.apache.spark.SparkException: Only one SparkContext should be 
> running in this JVM (see SPARK-2243).The currently running SparkContext was 
> created at:
> [info] org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
> [info] 
> org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:851)
> [info] 
> org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
> [info] 
> org.apache.spark.streaming.TestSuiteBase.setupStreams(TestSuiteBase.scala:317)
> [info] 
> org.apache.spark.streaming.TestSuiteBase.setupStreams$(TestSuiteBase.scala:311)
> [info] 
> org.apache.spark.streaming.CheckpointSuite.setupStreams(CheckpointSuite.scala:209)
> [info] 
> org.apache.spark.streaming.CheckpointSuite.$anonfun$new$3(CheckpointSuite.scala:258)
> [info] scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
> [info] org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
> [info] org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
> [info] org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> [info] org.scalatest.Transformer.apply(Transformer.scala:22)
> [info] org.scalatest.Transformer.apply(Transformer.scala:20)
> [info] org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> [info] org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:149)
> [info] org.scalatest.FunSuiteLike.invokeWithFixture$1(FunSuiteLike.scala:184)
> [info] org.scalatest.FunSuiteLike.$anonfun$runTest$1(FunSuiteLike.scala:196)
> [info] org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> [info] org.scalatest.FunSuiteLike.runTest(FunSuiteLike.scala:196)
> [info] org.scalatest.FunSuiteLike.runTest$(FunSuiteLike.scala:178)
> [info]   at 
> org.apache.spark.SparkContext$.$anonfun$assertNoOtherContextIsRunning$2(SparkContext.scala:2512)
> [info]   at scala.Option.foreach(Option.scala:274)
> [info]   at 
> org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2509)
> [info]   at 
> org.apache.spark.SparkContext$.markPartiallyConstructed(SparkContext.scala:2586)
> [info]   at org.apache.spark.SparkContext.<init>(SparkContext.scala:87)
> [info]   at 
> org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:851)
> [info]   at 
> org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
> [info]   at 
> org.apache.spark.streaming.scheduler.JobGeneratorSuite.$anonfun$new$1(JobGeneratorSuite.scala:65)
> [info]   at 
> scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
> [info]   at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
> [info]   at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
> [info]   at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
> [info]   at org.scalatest.Transformer.apply(Transformer.scala:22)
> [info]   at org.scalatest.Transformer.apply(Transformer.scala:20)
> [info]   at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:186)
> [info]   at 
> org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:149)
> [info]   at 
> org.scalatest.FunSuiteLike.invokeWithFixture$1(FunSuiteLike.scala:184)
> [info]   at 
> org.scalatest.FunSuiteLike.$anonfun$runTest$1(FunSuiteLike.scala:196)
> [info]   at org.scalatest.SuperEngine.runTestImpl(Engine.scala:289)
> [info]   at org.scalatest.FunSuiteLike.runTest(FunSuiteLike.scala:196)
> [info]   at org.scalatest.FunSuiteLike.runTest$(FunSuiteLike.scala:178)
> [info]   at 
> org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:56)
> [info]   at 
> org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:221)
> [info]   at 
> org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:214)
> [info]   at 
> org.apache.spark.streaming.scheduler.JobGeneratorSuite.org$scalatest$BeforeAndAfter$$super$runTest(JobGeneratorSuite.scala:30)
> [info]   at org.scalatest.BeforeAndAfter.runTest(BeforeAndAfter.scala:203)
> [info]   at org.scalatest.BeforeAndAfter.runTest$(BeforeAndAfter.scala:192)
> [info]   at 
> org.apache.spark.streaming.scheduler.JobGeneratorSuite.runTest(JobGeneratorSuite.scala:30)
> [info]   at 
> org.scalatest.FunSuiteLike.$anonfun$runTests$1(FunSuiteLike.scala:229)
> [info]   at 
> org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:396)
> [info]   at scala.collection.immutable.List.foreach(List.scala:392)
> [info]   at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:384)
> [info]   at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:379)
> [info]   at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:461)
> [info]   at org.scalatest.FunSuiteLike.runTests(FunSuiteLike.scala:229)
> [info]   at org.scalatest.FunSuiteLike.runTests$(FunSuiteLike.scala:228)
> [info]   at org.scalatest.FunSuite.runTests(FunSuite.scala:1560)
> [info]   at org.scalatest.Suite.run(Suite.scala:1147)
> [info]   at org.scalatest.Suite.run$(Suite.scala:1129)
> [info]   at 
> org.scalatest.FunSuite.org$scalatest$FunSuiteLike$$super$run(FunSuite.scala:1560)
> [info]   at org.scalatest.FunSuiteLike.$anonfun$run$1(FunSuiteLike.scala:233)
> [info]   at org.scalatest.SuperEngine.runImpl(Engine.scala:521)
> [info]   at org.scalatest.FunSuiteLike.run(FunSuiteLike.scala:233)
> [info]   at org.scalatest.FunSuiteLike.run$(FunSuiteLike.scala:232)
> [info]   at 
> org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:56)
> [info]   at 
> org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)
> [info]   at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)
> [info]   at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)
> [info]   at 
> org.apache.spark.streaming.scheduler.JobGeneratorSuite.org$scalatest$BeforeAndAfter$$super$run(JobGeneratorSuite.scala:30)
> [info]   at org.scalatest.BeforeAndAfter.run(BeforeAndAfter.scala:258)
> [info]   at org.scalatest.BeforeAndAfter.run$(BeforeAndAfter.scala:256)
> [info]   at 
> org.apache.spark.streaming.scheduler.JobGeneratorSuite.run(JobGeneratorSuite.scala:30)
> [info]   at 
> org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:314)
> [info]   at 
> org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:507)
> [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:296)
> [info]   at sbt.ForkMain$Run$2.call(ForkMain.java:286)
> [info]   at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> [info]   at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> [info]   at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> [info]   at java.lang.Thread.run(Thread.java:748) {noformat}
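The guard the ticket asks for can be illustrated with a small, self-contained sketch. Note this is not Spark code: `ContextRegistry` and `ensureNoActiveContext` below are hypothetical stand-ins for SparkContext's real "only one running SparkContext per JVM" check (SPARK-2243) and for a cleanup assertion a test suite could run in `afterEach`/`afterAll`.

```scala
// Hypothetical stand-in for SparkContext's one-context-per-JVM rule.
// In real Spark tests the equivalent cleanup is stopping the active
// SparkContext between tests so the next suite can create its own.
object ContextRegistry {
  @volatile private var active: Option[String] = None

  def start(name: String): Unit = synchronized {
    // Mirrors the "Only one SparkContext should be running in this JVM"
    // assertion seen in the stack trace above.
    require(active.isEmpty,
      s"Only one context should be running in this JVM; found: ${active.get}")
    active = Some(name)
  }

  def stop(name: String): Unit = synchronized {
    if (active.contains(name)) active = None
  }

  // Guard to run after each test/suite: fail fast if a context leaked,
  // instead of letting the *next* suite fail with a confusing error.
  def ensureNoActiveContext(): Unit = synchronized {
    require(active.isEmpty, s"Leaked context: ${active.get}")
  }
}
```

Running such a check in the teardown of every suite surfaces the leak at its source, rather than as a cascade of failures in unrelated suites like the one listed above.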



--
This message was sent by Atlassian Jira
(v8.3.2#803003)
