[jira] [Commented] (SPARK-4194) Exceptions thrown during SparkContext or SparkEnv construction might lead to resource leaks or corrupted global state

2015-04-02 Thread Apache Spark (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14393249#comment-14393249 ]

Apache Spark commented on SPARK-4194:
-------------------------------------

User 'vanzin' has created a pull request for this issue:
https://github.com/apache/spark/pull/5335

 Exceptions thrown during SparkContext or SparkEnv construction might lead to 
 resource leaks or corrupted global state
 -

 Key: SPARK-4194
 URL: https://issues.apache.org/jira/browse/SPARK-4194
 Project: Spark
  Issue Type: Bug
  Components: Spark Core
Reporter: Josh Rosen
Priority: Critical

 The SparkContext and SparkEnv constructors instantiate a number of objects 
 that may need to be cleaned up once they are no longer needed.  If an 
 exception is thrown during SparkContext or SparkEnv construction (e.g. due to 
 a bad configuration setting), then objects created earlier in the constructor 
 may not be cleaned up properly.
 This is unlikely to cause problems for batch jobs submitted through 
 {{spark-submit}}, since failure to construct SparkContext will probably cause 
 the JVM to exit, but it is a potentially serious issue in interactive 
 environments where a user might attempt to create SparkContext with some 
 configuration, fail due to an error, and re-attempt the creation with new 
 settings.  In this case, resources from the previous creation attempt might 
 not have been cleaned up and could lead to confusing errors (especially if 
 the old, leaked resources share global state with the new SparkContext).
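
To make the interactive scenario above concrete, here is a minimal user-side sketch (e.g. from spark-shell); the invalid master URL is just one illustrative way to make construction fail partway through:

{code:scala}
// Minimal user-side sketch of the scenario described above (e.g. in spark-shell).
// The invalid master URL is just one illustrative way to make the
// SparkContext constructor fail partway through.
import org.apache.spark.{SparkConf, SparkContext}

// First attempt: construction throws after some internal components
// (e.g. SparkEnv) have already been created.
try {
  new SparkContext(new SparkConf().setAppName("attempt-1").setMaster("not-a-real-master"))
} catch {
  case e: Exception => println(s"first attempt failed: ${e.getMessage}")
}

// Second attempt with corrected settings: resources and global state
// leaked by the failed attempt can now surface as confusing errors.
val sc = new SparkContext(new SparkConf().setAppName("attempt-2").setMaster("local[*]"))
{code}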



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-4194) Exceptions thrown during SparkContext or SparkEnv construction might lead to resource leaks or corrupted global state

2014-11-02 Thread Josh Rosen (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-4194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14193732#comment-14193732 ]

Josh Rosen commented on SPARK-4194:
-----------------------------------

I've marked this as a blocker of SPARK-4180, an issue that proposes throwing 
exceptions when users try to create multiple active SparkContexts in the same 
JVM.  PySpark already guards against this, but earlier versions of its 
error-checking code ran into issues where users would fail their initial 
attempt to create a SparkContext and then be unable to create new ones, because 
we didn't clear the {{activeSparkContext}} variable after the constructor threw 
an exception.

To fix this, we need to wrap the constructor code in a {{try}} block. 
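
Roughly, the fix has the shape of the sketch below. This is a self-contained illustration of the pattern, not Spark's actual code or the exact patch; the names used here (Context, setActive, clearActive) are placeholders for the real SparkContext bookkeeping:

{code:scala}
// Self-contained sketch of the pattern (not Spark's actual code): the
// constructor body runs inside a try block, and on failure everything
// created so far is released and the "active instance" registration is
// cleared before the exception is rethrown.
import java.io.Closeable
import java.util.concurrent.atomic.AtomicReference

object Context {
  // Stands in for the global activeSparkContext bookkeeping.
  private val active = new AtomicReference[Context](null)
  def setActive(c: Context): Unit = active.set(c)
  def clearActive(): Unit = active.set(null)
}

class Context(conf: Map[String, String]) {
  private var resources: List[Closeable] = Nil

  def stop(): Unit = {
    resources.foreach(r => try r.close() catch { case _: Exception => () })
    resources = Nil
  }

  try {
    Context.setActive(this)
    // Create components that need cleanup, in order (env, scheduler, UI, ...).
    resources ::= new Closeable { def close(): Unit = println("closing env") }
    // A bad setting discovered later in construction blows up here.
    if (!conf.contains("master")) throw new IllegalArgumentException("missing master setting")
    resources ::= new Closeable { def close(): Unit = println("closing scheduler") }
  } catch {
    case e: Throwable =>
      // Clean up partial state and unregister before propagating the error,
      // so a later construction attempt starts from a clean slate.
      try stop() finally Context.clearActive()
      throw e
  }
}
{code}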
