[ 
https://issues.apache.org/jira/browse/SPARK-12162?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15043782#comment-15043782
 ] 

Sasi commented on SPARK-12162:
------------------------------

Hey,
Thanks for the quick response. 
On my JBoss server I'm only running new SparkContext(sparkConf) and 
new SQLContext(sparkContext).
The workers run on a separate machine, and the master runs on that same 
machine.
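Roughly, this is what the JBoss side does (just a sketch; the master URL, app 
name, and object name below are placeholders, not my real values):
{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object EmbeddedSparkBootstrap {
  def main(args: Array[String]): Unit = {
    // Placeholder app name and master URL; the standalone master and the
    // workers run on the other machine, as described above.
    val sparkConf = new SparkConf()
      .setAppName("embedded-spark-on-jboss")
      .setMaster("spark://master-host:7077")

    val sparkContext = new SparkContext(sparkConf)
    val sqlContext = new SQLContext(sparkContext)

    // ...the application then submits its queries through sqlContext...
  }
}
{code}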

Is that the right way or am I missing something?

Thanks a lot!
Sasi


> Embedded Spark on JBoss server crashes due to System.exit when 
> SparkUncaughtExceptionHandler is called
> -------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-12162
>                 URL: https://issues.apache.org/jira/browse/SPARK-12162
>             Project: Spark
>          Issue Type: Bug
>            Reporter: Sasi
>            Priority: Critical
>
> Hello,
> I'm running Spark on JBoss and sometimes I get the following exception:
> {code}
> ERROR : (org.apache.spark.util.SparkUncaughtExceptionHandler:96) 
> -[appclient-registration-retry-thread] Uncaught exception in thread 
> Thread[appclient-registration-retry-thread,5,jboss]
> java.util.concurrent.RejectedExecutionException: Task 
> java.util.concurrent.FutureTask@4e33f83e rejected from 
> java.util.concurrent.ThreadPoolExecutor@35eed68e[Running, pool size = 1, 
> active threads = 0, queued tasks = 0, completed tasks = 3]
> {code}
> Then my JBoss crashed, so I took a look at the source of 
> SparkUncaughtExceptionHandler and noticed that when the handler is invoked it 
> calls System.exit(SparkExitCode.UNCAUGHT_EXCEPTION).
> [https://github.com/apache/spark/blob/3bd77b213a9cd177c3ea3c61d37e5098e55f75a5/core/src/main/scala/org/apache/spark/util/SparkUncaughtExceptionHandler.scala]
> Since System.exit(...) is called, my JBoss server crashes.
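> For illustration, the failure mode is not specific to Spark: any uncaught-exception 
> handler that calls System.exit(...) terminates the whole JVM, which in an embedded 
> deployment is the JBoss server itself. A minimal, self-contained sketch (not Spark 
> code; the thread name and exit code are placeholders):
> {code}
> object ExitOnUncaughtDemo {
>   def main(args: Array[String]): Unit = {
>     // A worker thread that fails with an uncaught exception.
>     val worker = new Thread(new Runnable {
>       override def run(): Unit = throw new RuntimeException("boom")
>     }, "appclient-registration-retry-thread")
>
>     // A handler that, like the one described above, reacts by exiting.
>     worker.setUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler {
>       override def uncaughtException(t: Thread, e: Throwable): Unit = {
>         System.err.println(s"Uncaught exception in thread ${t.getName}: $e")
>         System.exit(50) // exits the entire JVM, not just the failing thread
>       }
>     })
>
>     worker.start()
>     worker.join()
>     // If the handler fired, this line is never reached: the "server" JVM is gone.
>     println("still serving requests")
>   }
> }
> {code}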
> Is there any workaround/fix that can help me?
> Thanks,
> Sasi



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
