Thanks. Actually I've found a way. I'm using spark-submit to submit the
job to a YARN cluster with --master yarn-cluster (in which case the
spark-submit process is not the driver). So I can set
"spark.yarn.submit.waitAppCompletion" to "false", and the process will
exit as soon as the job is submitted.
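For reference, a minimal sketch of the full command; the jar path and
main class are placeholders:

  spark-submit \
    --master yarn-cluster \
    --conf spark.yarn.submit.waitAppCompletion=false \
    --class com.example.MyStreamingApp \
    /path/to/my-streaming-app.jar

With waitAppCompletion set to false, the client prints the YARN
application ID and returns right after submission instead of polling
until the application finishes.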

ayan guha <guha.a...@gmail.com> wrote on Wed, Jul 8, 2015 at 12:26 PM:

> spark-submit is nothing but a process in your OS, so you should be able to
> run it in the background and exit. However, your spark-submit process itself
> is the driver for your Spark Streaming application, so it will not exit for
> the lifetime of the streaming app.
>
> On Wed, Jul 8, 2015 at 1:13 PM, Bin Wang <wbi...@gmail.com> wrote:
>
>> I'm writing a streaming application and want to use spark-submit to
>> submit it to a YARN cluster. I'd like to submit it from a client node and
>> have spark-submit exit once the application is running. Is that possible?
>>
>
>
>
> --
> Best Regards,
> Ayan Guha
>
