Re: Testing --supervise flag

2016-08-02 Thread Noorul Islam Kamal Malmiyoda
Widening to dev@spark

On Mon, Aug 1, 2016 at 4:21 PM, Noorul Islam K M  wrote:
>
> Hi all,
>
> I was trying to test --supervise flag of spark-submit.
>
> The documentation [1] says that the flag restarts your
> application automatically if it exits with a non-zero exit code.
>
> I am looking for some clarification on that documentation. In this
> context, does "application" mean the driver?
>
> Will the driver be re-launched if an exception is thrown by the
> application? I tested this scenario, and the driver was not re-launched.
>
> ~/spark-1.6.1/bin/spark-submit --deploy-mode cluster \
>   --master spark://10.29.83.162:6066 \
>   --class org.apache.spark.examples.ExceptionHandlingTest \
>   /home/spark/spark-1.6.1/lib/spark-examples-1.6.1-hadoop2.6.0.jar
>
> I killed the driver java process using the 'kill -9' command, and the
> driver was re-launched.
>
> Is this the only scenario where the driver will be re-launched? Is there
> a way to simulate a non-zero exit code and test the --supervise flag?
>
> Regards,
> Noorul
>
> [1] 
> http://spark.apache.org/docs/latest/spark-standalone.html#launching-spark-applications

-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



Testing --supervise flag

2016-08-01 Thread Noorul Islam K M

Hi all,

I was trying to test --supervise flag of spark-submit.

The documentation [1] says that the flag restarts your application
automatically if it exits with a non-zero exit code.

I am looking for some clarification on that documentation. In this
context, does "application" mean the driver?

Will the driver be re-launched if an exception is thrown by the
application? I tested this scenario, and the driver was not re-launched.

~/spark-1.6.1/bin/spark-submit --deploy-mode cluster \
  --master spark://10.29.83.162:6066 \
  --class org.apache.spark.examples.ExceptionHandlingTest \
  /home/spark/spark-1.6.1/lib/spark-examples-1.6.1-hadoop2.6.0.jar

I killed the driver java process using the 'kill -9' command, and the
driver was re-launched.

Is this the only scenario where the driver will be re-launched? Is there a
way to simulate a non-zero exit code and test the --supervise flag?
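
For reference, here is one way to see which failure modes actually produce a
non-zero exit status, which is what the documentation suggests --supervise
keys on. This is only a sketch in plain Python, independent of Spark; the
stand-in "driver" snippets are illustrative assumptions, not Spark code:

```python
import subprocess
import sys

def driver_status(body: str) -> int:
    """Run a tiny stand-in 'driver' in a child interpreter and return
    its exit status, the way a supervising process would observe it."""
    return subprocess.run([sys.executable, "-c", body]).returncode

# An uncaught exception makes the interpreter exit with status 1.
print(driver_status("raise RuntimeError('simulated driver failure')"))  # 1

# An explicit sys.exit(n) exits with status n.
print(driver_status("import sys; sys.exit(2)"))  # 2

# A clean run exits 0, so a supervisor keying on non-zero status
# would have no reason to restart it.
print(driver_status("pass"))  # 0
```

If the same holds for the driver JVM, then an uncaught exception in the
driver's main should count as a non-zero exit, while a caught-and-logged
exception followed by a normal return would not.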

Regards,
Noorul

[1] 
http://spark.apache.org/docs/latest/spark-standalone.html#launching-spark-applications
