Hello,

I am trying to use the default (standalone) Spark cluster manager in a
production environment. I will be submitting jobs with spark-submit, and I
wonder whether the following is possible:

1. Whether we can get the Driver ID back from spark-submit. We will use this
ID to keep track of the job and kill it if necessary (see the sketch after
this list).

2. Whether spark-submit can run in a mode where it exits and returns control
to the caller immediately after the job is submitted (also covered in the
sketch below).
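
For context, here is roughly the workflow we have in mind. This is only a
sketch: the master URL spark://master:7077, the jar path /path/to/app.jar,
and the class com.example.Main are placeholders. Our understanding is that in
standalone cluster mode spark-submit prints a driver ID (something like
driver-20160330-0001), and that the same ID is accepted by the --kill and
--status flags listed in spark-submit --help:

  # Submit in cluster mode; the client should hand the driver to the master
  # and print a driver ID in its output.
  SUBMIT_OUTPUT=$(spark-submit \
    --master spark://master:7077 \
    --deploy-mode cluster \
    --class com.example.Main \
    /path/to/app.jar 2>&1)

  # Pull the driver ID out of the submission output (the exact log format
  # may vary between Spark versions, so this grep is a guess).
  DRIVER_ID=$(echo "$SUBMIT_OUTPUT" | grep -o 'driver-[0-9]*-[0-9]*' | head -n 1)
  echo "Submitted driver: $DRIVER_ID"

  # Later, from our job tracker: poll the driver's status, or kill it.
  spark-submit --master spark://master:7077 --status "$DRIVER_ID"
  spark-submit --master spark://master:7077 --kill "$DRIVER_ID"

If that reading is right, point 2 falls out of --deploy-mode cluster as well,
since the submitting process exits once the master acknowledges the driver.
Does this match how the standalone manager is meant to be used?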

Thanks!
Rares
