[ https://issues.apache.org/jira/browse/SPARK-17630?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Mario Briggs updated SPARK-17630:
---------------------------------
    Summary: jvm-exit-on-fatal-error handler for spark.rpc.netty like there is 
available for akka  (was: jvm-exit-on-fatal-error for spark.rpc.netty like 
there is available for akka)

> jvm-exit-on-fatal-error handler for spark.rpc.netty like there is available 
> for akka
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-17630
>                 URL: https://issues.apache.org/jira/browse/SPARK-17630
>             Project: Spark
>          Issue Type: Question
>          Components: Spark Core
>    Affects Versions: 1.6.0
>            Reporter: Mario Briggs
>         Attachments: SecondCodePath.txt, firstCodepath.txt
>
>
> Hi,
> I have two code paths in my app that result in a JVM OOM.
> In the first code path, 'akka.jvm-exit-on-fatal-error' kicks in and shuts 
> down the JVM, so that the caller (Py4J) gets notified with a proper stack 
> trace. See the attached stack-trace file (firstCodepath.txt).
> In the second code path (rpc.netty), no such handler kicks in to shut down 
> the JVM, so the caller does not get notified. See the attached stack-trace 
> file (SecondCodepath.txt).
> Is it possible to have a JVM exit handler for the rpc.netty path? A rough 
> sketch of what such a handler could look like is included below.
> 
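
For reference, here is a minimal sketch (in Scala) of the kind of fatal-error
handler the question asks about. It is an illustration under assumptions, not
Spark's actual implementation: the object name FatalErrorExitHandler, the exit
code, and the choice of a process-wide default UncaughtExceptionHandler are all
hypothetical. The idea mirrors what akka.jvm-exit-on-fatal-error provides: when
a fatal Throwable such as OutOfMemoryError escapes a thread, halt the JVM so an
external caller like Py4J sees the process die rather than hang.

    import java.lang.Thread.UncaughtExceptionHandler

    // Illustrative only: installs a process-wide handler that halts the JVM on
    // fatal errors, similar in spirit to akka.jvm-exit-on-fatal-error.
    object FatalErrorExitHandler {
      def install(): Unit = {
        Thread.setDefaultUncaughtExceptionHandler(new UncaughtExceptionHandler {
          override def uncaughtException(thread: Thread, throwable: Throwable): Unit = {
            throwable match {
              case fatal @ (_: OutOfMemoryError | _: StackOverflowError) =>
                System.err.println(s"Fatal error in thread ${thread.getName}: $fatal")
                // halt() skips shutdown hooks, which may themselves fail under
                // memory pressure; the exit code 1 is arbitrary here.
                Runtime.getRuntime.halt(1)
              case other =>
                System.err.println(s"Uncaught exception in thread ${thread.getName}: $other")
            }
          }
        })
      }
    }

For the netty RPC code path to behave like the akka one, the threads processing
RPC messages would need such a handler installed (or would need to route fatal
Throwables to it) so that an OOM actually terminates the process.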


