[ 
https://issues.apache.org/jira/browse/SPARK-7958?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Tathagata Das updated SPARK-7958:
---------------------------------
    Description: StreamingContext.start() can throw an exception because 
DStream.validateAtStart() fails (say, the checkpoint directory is not set for 
a StateDStream). But by then the JobScheduler, JobGenerator, and ReceiverTracker 
have already started, along with their actors. Those cannot be shut down, 
because the only way to do that is to call StreamingContext.stop(), which 
cannot be called since the context has not been marked as ACTIVE.  (was: 
StreamingContext.start() throws an exception because DStream.validateAtStart() 
fails (say, the checkpoint directory is not set for a StateDStream). But by 
then the JobScheduler, JobGenerator, and ReceiverTracker have already started, 
along with their actors. Those cannot be shut down, because the only way to do 
that is to call StreamingContext.stop(), which cannot be called since the 
context has not been marked as ACTIVE.)
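
A minimal reproduction sketch of the scenario described above (the local master, app name, and socket source on port 9999 are illustrative assumptions; the exact exception type thrown by DStream validation is not pinned down here, so the catch is deliberately broad):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import scala.util.control.NonFatal

object Spark7958Repro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("SPARK-7958-repro")
    val ssc = new StreamingContext(conf, Seconds(1))

    // updateStateByKey produces a StateDStream, which requires a checkpoint
    // directory; deliberately do NOT call ssc.checkpoint(...) here.
    val words = ssc.socketTextStream("localhost", 9999)
    val counts = words.map(w => (w, 1)).updateStateByKey[Int] {
      (values: Seq[Int], state: Option[Int]) => Some(values.sum + state.getOrElse(0))
    }
    counts.print()

    try {
      // start() fails during DStream validation, but by this point the
      // JobScheduler, JobGenerator, and ReceiverTracker have already started.
      ssc.start()
    } catch {
      case NonFatal(e) =>
        // In affected versions this does not clean up the already-started
        // components, because the context was never marked ACTIVE.
        ssc.stop(stopSparkContext = true)
    }
  }
}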

> Failed StreamingContext.start() can leak active actors
> ------------------------------------------------------
>
>                 Key: SPARK-7958
>                 URL: https://issues.apache.org/jira/browse/SPARK-7958
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.4.0
>            Reporter: Tathagata Das
>            Assignee: Tathagata Das
>            Priority: Critical
>
> StreamingContext.start() can throw an exception because 
> DStream.validateAtStart() fails (say, the checkpoint directory is not set 
> for a StateDStream). But by then the JobScheduler, JobGenerator, and 
> ReceiverTracker have already started, along with their actors. Those cannot 
> be shut down, because the only way to do that is to call 
> StreamingContext.stop(), which cannot be called since the context has not 
> been marked as ACTIVE.



