[ https://issues.apache.org/jira/browse/SPARK-10458?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-10458:
------------------------------------

    Assignee: Apache Spark

> Would like to know if a given Spark Context is stopped or currently stopping
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-10458
>                 URL: https://issues.apache.org/jira/browse/SPARK-10458
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Matt Cheah
>            Assignee: Apache Spark
>            Priority: Minor
>
> I ran into a case where a thread stopped a SparkContext, specifically when I
> hit the "kill" link from the Spark standalone UI. There was no real way for
> another thread to know that the context had stopped, and thus to handle the
> situation accordingly.
>
> Checking whether the SparkEnv is null is one way, but that doesn't handle the
> case where the context is in the midst of stopping, and stopping the context
> may not be instantaneous - in my case the DAGScheduler took a non-trivial
> amount of time to shut down.
>
> Implementation-wise, I'm more or less requesting that the boolean value
> returned from SparkContext.stopped.get() be visible in some way. As long as we
> return the value and not the AtomicBoolean itself (we wouldn't want anyone to
> be setting it, after all!), it would help client applications check the
> context's liveness.
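> As a rough sketch of the kind of accessor I have in mind (the class and method
> names below are illustrative only, not a claim about SparkContext's actual
> internals), the pattern would be:
>
>     import java.util.concurrent.atomic.AtomicBoolean
>
>     // Minimal model of a context that tracks its own shutdown state.
>     class ContextLike {
>       private val stopped = new AtomicBoolean(false)
>
>       def stop(): Unit = {
>         if (stopped.compareAndSet(false, true)) {
>           // ... tear down the scheduler, env, listeners, etc.
>         }
>       }
>
>       // The requested addition: expose only a boolean snapshot, never the
>       // AtomicBoolean itself, so callers can check liveness but not mutate it.
>       def isStopped: Boolean = stopped.get()
>     }
>
> A monitoring thread could then poll something like sc.isStopped before
> submitting further work, instead of discovering a dead context by failing.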


