Thanks Akhil, Richard, and Oleg for your quick responses.

@Oleg we have actually tried the same thing, but unfortunately when we throw
the exception, the Akka framework catches it, treats the job as failed, and
reruns the Spark jobs infinitely. Since the max number of retries in Akka's
OneForOneStrategy is set to infinite, is there any way to configure this
value? Or is there any other way to solve this problem?


If we don't throw an exception in checkExit(), the JVM will exit, right? Is
there a way to stop the JVM from exiting?

On Wed, Jun 3, 2015 at 9:01 PM, Oleg Zhurakousky <oleg.zhurakou...@gmail.com
> wrote:

> I am not sure why Spark is relying on System.exit, hopefully someone will
> be able to provide a technical justification for it (very curious to hear
> it), but for your use case you can easily trap the System.exit call before
> the JVM exits, with a simple implementation of SecurityManager and a
> try/catch. Here are more details (extracted from some of the code I am
> using to deal with the same problem in Hadoop processes, so it's Java, but
> you'll get the point):
>
> 1. Create a simple implementation of SecurityManager:
>
> import java.security.Permission;
>
> public class SystemExitDisallowingSecurityManager extends SecurityManager {
>     @Override
>     public void checkPermission(Permission perm) {
>       // allow everything
>     }
>
>     @Override
>     public void checkPermission(Permission perm, Object context) {
>       // allow everything
>     }
>
>     @Override
>     public void checkExit(int status) {
>       // veto the exit by throwing; returning normally would let the JVM exit
>       throw new SystemExitException();
>     }
> }
>
> 2. Create SystemExitException:
>
> public class SystemExitException extends RuntimeException {
>   public SystemExitException() { }
> }
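>
> A small variation (my own sketch, not part of the recipe above): let the
> exception carry the exit status so the calling code can still inspect it
> after trapping the call:
>
> public class SystemExitException extends RuntimeException {
>   private final int status;
>
>   public SystemExitException(int status) {
>     super("Trapped System.exit(" + status + ")");
>     this.status = status;
>   }
>
>   public int getStatus() { return status; }
> }
>
> // ... and checkExit(int status) would then throw new SystemExitException(status)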
>
> 3. When instantiating your workflow engine, register the aforementioned
> SecurityManager (e.g., in the constructor):
>
>         System.setSecurityManager(new SystemExitDisallowingSecurityManager());
>
> 3.1. In your workflow engine, invoke the process in a try/catch block:
>
> try {
>      // invoke the spark-submit process
> }
> catch (SystemExitException e) {
>      // log some message but allow to continue
> }
>
> When Spark triggers System.exit, the SecurityManager will throw
> SystemExitException, which you simply catch and ignore. Or you can even
> avoid triggering SystemExitException altogether, essentially ignoring all
> the calls to System.exit.
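>
> Roughly how these pieces fit together (a minimal sketch; runSparkSubmit()
> is a hypothetical stand-in for however you invoke spark-submit in-process,
> and saving/restoring the previous SecurityManager is my own addition):
>
> SecurityManager previous = System.getSecurityManager();
> System.setSecurityManager(new SystemExitDisallowingSecurityManager());
> try {
>     runSparkSubmit();  // would normally end in a System.exit(...) call
> }
> catch (SystemExitException e) {
>     // trapped the exit; log and let the workflow engine continue
> }
> finally {
>     System.setSecurityManager(previous);  // stop trapping exits elsewhere
> }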
>
> Cheers
> Oleg
>
>
> On Wed, Jun 3, 2015 at 10:55 AM, Richard Marscher <
> rmarsc...@localytics.com> wrote:
>
>> I think the short answer to the question is, no, there is no alternate
>> API that avoids the System.exit calls. You can craft a workaround like
>> the one being suggested in this thread. For comparison, we are doing
>> programmatic submission of applications in a long-running client
>> application. To get around these issues we keep a shadowed version of
>> some of the Spark code in our application with the System.exit calls
>> removed, so exceptions bubble up to our application instead.
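>>
>> The shadowing idea, in its simplest form (a hypothetical illustration, not
>> actual Spark source): wherever the original class does
>>
>>     System.exit(exitCode);
>>
>> the shadowed copy, compiled into our application so it wins on the
>> classpath, does something like
>>
>>     throw new IllegalStateException("spark-submit exited with " + exitCode);
>>
>> so the failure surfaces as an exception that bubbles up to our code.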
>>
>> On Wed, Jun 3, 2015 at 7:19 AM, Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>
>>> Did you try this?
>>>
>>> Create an sbt project like:
>>>
>>> import org.apache.spark.{SparkConf, SparkContext}
>>>
>>> object Sigmoid {
>>>   def main(args: Array[String]): Unit = {
>>>     // Create your context
>>>     val sconf = new SparkConf()
>>>       .setAppName("Sigmoid")
>>>       .setMaster("spark://sigmoid:7077")
>>>     val sc = new SparkContext(sconf)
>>>
>>>     // Do some computations
>>>     sc.parallelize(1 to 10000).take(10).foreach(println)
>>>
>>>     // Now return the exit status
>>>     System.exit(0)  // or a non-zero status to signal failure
>>>   }
>>> }
>>>
>>> Now, make your workflow manager trigger *sbt run* on the project
>>> instead of using spark-submit.
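>>>
>>> A sketch of that from the workflow manager's side (my own illustration;
>>> it assumes sbt is on the PATH and the project path is one you supply):
>>>
>>> // Launch the sbt project in a child JVM, so System.exit only ends the
>>> // child process, and read the status the child exited with.
>>> ProcessBuilder pb = new ProcessBuilder("sbt", "run");
>>> pb.directory(new java.io.File("/path/to/sbt/project"));  // hypothetical path
>>> pb.inheritIO();                       // stream the job's output
>>> Process process = pb.start();
>>> int exitStatus = process.waitFor();   // exit status of the child JVM
>>> // exitStatus can now be handed back to the workflow engine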
>>>
>>>
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Wed, Jun 3, 2015 at 2:18 PM, pavan kumar Kolamuri <
>>> pavan.kolam...@gmail.com> wrote:
>>>
>>>> Hi Akhil, sorry, I may not be conveying the question properly. Actually
>>>> we are looking to launch a Spark job from a long-running workflow
>>>> manager, which invokes the Spark client via SparkSubmit. Unfortunately
>>>> the client exits with System.exit(0) upon successful completion of the
>>>> application, or System.exit(NON_ZERO) when there is a failure. The
>>>> question is: is there an alternate API through which a Spark application
>>>> can be launched and which can return an exit status back to the caller,
>>>> as opposed to initiating a JVM halt?
>>>>
>>>> On Wed, Jun 3, 2015 at 12:58 PM, Akhil Das <ak...@sigmoidanalytics.com>
>>>> wrote:
>>>>
>>>>> Run it as a standalone application. Create an sbt project and do sbt
>>>>> run?
>>>>>
>>>>> Thanks
>>>>> Best Regards
>>>>>
>>>>> On Wed, Jun 3, 2015 at 11:36 AM, pavan kumar Kolamuri <
>>>>> pavan.kolam...@gmail.com> wrote:
>>>>>
>>>>>> Hi guys, I am new to Spark. I am using SparkSubmit to submit Spark
>>>>>> jobs. But for my use case I don't want it to exit with System.exit.
>>>>>> Is there any other Spark client that is API-friendly, other than
>>>>>> SparkSubmit, and that doesn't exit with System.exit? Please correct
>>>>>> me if I am missing something.
>>>>>>
>>>>>> Thanks in advance
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> --
>>>>>> Regards
>>>>>> Pavan Kumar Kolamuri
>>>>>>
>>>>>>
>>>>>
>>>>
>>>>
>>>> --
>>>> Regards
>>>> Pavan Kumar Kolamuri
>>>>
>>>>
>>>
>>
>


-- 
Regards
Pavan Kumar Kolamuri
