[ https://issues.apache.org/jira/browse/SPARK-12175?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15044640#comment-15044640 ]

Sean Owen commented on SPARK-12175:
-----------------------------------

[~sasi2103] I don't know that this is something to fix. As I mentioned, please 
search for other related issues to get more background on the discussion. I 
left questions for you on the other issue.

> Add a new flag to Spark that identifies whether the driver runs on an application server.
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-12175
>                 URL: https://issues.apache.org/jira/browse/SPARK-12175
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Sasi
>
> Hi,
> I'm running my driver on JBoss, and I have noticed that there is a use case 
> where the application tries to create a new SparkContext while the Spark 
> master is down. This throws an unhandled exception, and 
> [SparkUncaughtExceptionHandler|https://github.com/apache/spark/blob/3bd77b213a9cd177c3ea3c61d37e5098e55f75a5/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala]
> then calls System.exit, which kills JBoss.
> I think a flag should be added that tells Spark which environment the driver 
> is running in. For example, if driverRunOnAppServer = true, then no 
> System.exit would occur, and the developer would have to handle the case 
> where the Spark master is down.
> Best regards,
> Sasi
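
To make the request concrete, below is a minimal sketch of the behaviour the reporter is asking for. The flag name (driverRunOnAppServer, read here from a hypothetical spark.driver.runOnAppServer system property) and the handler object are illustrative only; they are not part of Spark's API.

{code:scala}
// Illustrative sketch only: driverRunOnAppServer / spark.driver.runOnAppServer
// are hypothetical names from this ticket, not real Spark configuration keys.
object AppServerAwareExceptionHandler extends Thread.UncaughtExceptionHandler {

  // Assumption: the embedding application (e.g. JBoss) sets this property
  // before any Spark code runs.
  private val driverRunOnAppServer: Boolean =
    sys.props.get("spark.driver.runOnAppServer").exists(_.toBoolean)

  override def uncaughtException(thread: Thread, exception: Throwable): Unit = {
    // Always report the failure.
    System.err.println(s"Uncaught exception in thread ${thread.getName}: $exception")
    if (!driverRunOnAppServer) {
      // Current Spark behaviour: terminate the JVM, which also takes down the
      // application server hosting the driver.
      System.exit(1)
    }
    // Otherwise, leave the JVM alive and let the hosting application decide
    // how to recover (e.g. retry creating the SparkContext later).
  }
}
{code}

In this scenario the handler would be installed via Thread.setDefaultUncaughtExceptionHandler, and the JBoss-hosted driver could then catch the SparkContext creation failure itself instead of losing the whole JVM.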


