How about using Livy to submit jobs?
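
For reference, a minimal sketch of a batch submission through Livy's
REST API (the Livy host, jar path and class name below are
placeholders; this assumes a Livy server fronting the same YARN
cluster):

  import java.net.URI;
  import java.net.http.HttpClient;
  import java.net.http.HttpRequest;
  import java.net.http.HttpResponse;

  public class LivySubmit {
    public static void main(String[] args) throws Exception {
      // Livy queues the job on YARN itself, so no local spark-submit
      // JVM is forked per application.
      String body = "{\"file\": \"hdfs:///path/to/app.jar\","
          + " \"className\": \"com.example.MyApp\"}";
      HttpRequest req = HttpRequest.newBuilder()
          .uri(URI.create("http://livy-host:8998/batches"))
          .header("Content-Type", "application/json")
          .POST(HttpRequest.BodyPublishers.ofString(body))
          .build();
      HttpResponse<String> resp = HttpClient.newHttpClient()
          .send(req, HttpResponse.BodyHandlers.ofString());
      System.out.println(resp.body());  // batch id and state as JSON
    }
  }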

On Thu, 17 May 2018 at 7:24 am, Marcelo Vanzin <van...@cloudera.com> wrote:

> You can either:
>
> - set spark.yarn.submit.waitAppCompletion=false, which will make
> spark-submit go away once the app starts in cluster mode.
> - use the (new in 2.3) InProcessLauncher class + some custom Java code
> to submit all the apps from the same "launcher" process.
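>
> A minimal sketch of the InProcessLauncher option (jar path and class
> name below are placeholders; requires Spark 2.3+ on the launcher's
> classpath):
>
>   import org.apache.spark.launcher.InProcessLauncher;
>   import org.apache.spark.launcher.SparkAppHandle;
>
>   public class MultiSubmitter {
>     public static void main(String[] args) throws Exception {
>       // Submits one application from this JVM; no child
>       // spark-submit process is forked per app.
>       SparkAppHandle handle = new InProcessLauncher()
>           .setMaster("yarn")
>           .setDeployMode("cluster")
>           .setAppResource("/path/to/app.jar")   // placeholder
>           .setMainClass("com.example.MyApp")    // placeholder
>           .setConf("spark.yarn.submit.waitAppCompletion", "false")
>           .startApplication();
>       // Poll handle.getState() or register a SparkAppHandle.Listener
>       // to track the application's progress.
>     }
>   }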
>
> On Wed, May 16, 2018 at 1:45 PM, Shiyuan <gshy2...@gmail.com> wrote:
> > Hi Spark-users,
> > I want to submit as many Spark applications as the resources permit.
> > I am using cluster mode on a YARN cluster. YARN can queue and launch
> > these applications without problems. The problem lies in spark-submit
> > itself: spark-submit starts a JVM, which could fail due to
> > insufficient memory on the machine where I run spark-submit if many
> > spark-submit JVMs are running. Any suggestions on how to solve this
> > problem? Thank you!
>
>
>
> --
> Marcelo
>
--
Best Regards,
Ayan Guha
