Sasi created SPARK-12175:
----------------------------

             Summary: Add a new flag to Spark that identifies whether the driver runs on an application server.
                 Key: SPARK-12175
                 URL: https://issues.apache.org/jira/browse/SPARK-12175
             Project: Spark
          Issue Type: Bug
            Reporter: Sasi


Hi,
I'm running my driver on JBoss, and I have noticed that in some cases code tries to create a new SparkContext while the Spark master is down. The resulting unhandled exception reaches 
[SparkUncaughtExceptionHandler|https://github.com/apache/spark/blob/3bd77b213a9cd177c3ea3c61d37e5098e55f75a5/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala], which calls System.exit and kills JBoss.
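
To make the scenario concrete, here is a minimal sketch of the kind of driver code that triggers this, assuming a standalone master URL that is unreachable; the master host name is made up for illustration:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Driver code hosted inside JBoss; the master below is assumed to be down.
val conf = new SparkConf()
  .setAppName("embedded-driver")
  .setMaster("spark://unreachable-master:7077")

try {
  val sc = new SparkContext(conf)
  sc.stop()
} catch {
  case e: Exception =>
    // Catching here does not help: as reported above, the failure surfaces
    // as an *uncaught* exception on a Spark internal thread, so it reaches
    // SparkUncaughtExceptionHandler, which calls System.exit and takes the
    // whole JBoss JVM down with it.
    println(s"SparkContext creation failed: ${e.getMessage}")
}
{code}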

I think a flag should be added that tells Spark which environment the driver is running in.
For example, if driverRunOnAppServer = true, then no System.exit will occur.

The developer will then have to handle the case where the Spark master is down.
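
A minimal sketch of what the guarded handler could look like; the property name spark.driver.runOnAppServer and the standalone handler object are hypothetical, and only loosely mirror the real SparkUncaughtExceptionHandler:

{code:scala}
import java.lang.Thread.UncaughtExceptionHandler

// Sketch of the proposed behaviour. The key "spark.driver.runOnAppServer"
// is this issue's suggestion, not an existing Spark configuration.
object AppServerAwareExceptionHandler extends UncaughtExceptionHandler {
  private val UNCAUGHT_EXCEPTION = 50 // mirrors SparkExitCode.UNCAUGHT_EXCEPTION

  override def uncaughtException(thread: Thread, exception: Throwable): Unit = {
    System.err.println(s"Uncaught exception in thread $thread: $exception")
    val runOnAppServer =
      sys.props.get("spark.driver.runOnAppServer").exists(_.toBoolean)
    if (!runOnAppServer) {
      // Standalone driver: keep today's fail-fast behaviour.
      System.exit(UNCAUGHT_EXCEPTION)
    }
    // Driver embedded in an application server (e.g. JBoss): skip
    // System.exit so the container keeps running; the developer is
    // responsible for reacting to the dead master.
  }
}
{code}

Usage would then be something like Thread.setDefaultUncaughtExceptionHandler(AppServerAwareExceptionHandler) in the driver, with -Dspark.driver.runOnAppServer=true passed to the JBoss JVM.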

Best regards,
Sasi


