On 21 Apr 2015, at 17:34, Richard Marscher <rmarsc...@localytics.com> wrote:
- There are System.exit calls built into Spark as of now that could kill your running JVM. We have shadowed some of the most offensive bits within our own application to work around this. You'd likely want to do that or maintain your own Spark fork. For example, if the SparkContext can't connect to your cluster master node when it is created, it will call System.exit.

People can block "errant" System.exit calls by running under a SecurityManager. Less than ideal (and there's a small performance hit), but possible; a rough sketch of that approach follows.
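For anyone curious, here is a minimal sketch of that trick in Scala, under the assumption that you install the manager before constructing the SparkContext. The names (NoExitSecurityManager, ExitTrappedException) are my own, not anything from Spark itself:

import java.security.Permission

// Illustrative only: a SecurityManager that turns System.exit into an
// exception instead of terminating the JVM.
class ExitTrappedException(val status: Int)
  extends SecurityException(s"System.exit($status) trapped")

class NoExitSecurityManager extends SecurityManager {
  // Throw rather than allow the JVM to exit.
  override def checkExit(status: Int): Unit =
    throw new ExitTrappedException(status)

  // Permit everything else so the rest of the application is unaffected.
  override def checkPermission(perm: Permission): Unit = ()
}

// Install before creating the SparkContext, so an exit triggered during
// construction (e.g. failure to reach the master) surfaces as a catchable
// exception in your application instead of killing the process.
System.setSecurityManager(new NoExitSecurityManager)

You can then wrap SparkContext creation in a try/catch for ExitTrappedException and handle the failure however you like, rather than having the process die underneath you.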