Hi,

Maybe this is what you are looking for:
http://spark.apache.org/docs/1.2.0/job-scheduling.html#fair-scheduler-pools
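
For the archives, a minimal sketch of what that page describes: you define named pools in conf/fairscheduler.xml and give higher-priority work a larger weight. The pool names and numbers below are only illustrative:

```xml
<!-- conf/fairscheduler.xml (example pools, names/weights are arbitrary) -->
<allocations>
  <!-- jobs in this pool get ~4x the scheduling weight -->
  <pool name="highPriority">
    <schedulingMode>FAIR</schedulingMode>
    <weight>4</weight>
    <minShare>2</minShare>
  </pool>
  <pool name="lowPriority">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>0</minShare>
  </pool>
</allocations>
```

Then enable fair scheduling with spark.scheduler.mode=FAIR and, in the thread that submits the generated jobs, call sc.setLocalProperty("spark.scheduler.pool", "highPriority") before triggering the action.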

Thanks,

On Mon, Mar 16, 2015 at 8:15 PM, abhi <abhishek...@gmail.com> wrote:

> Hi,
> Currently all the jobs in Spark get submitted using a queue. I have a
> requirement where a submitted job will generate another set of jobs with
> some priority, which should again be submitted to the Spark cluster based
> on that priority. That is, a job with higher priority should be executed
> first. Is this feasible?
>
> Any help is appreciated.
>
> Thanks,
> Abhi
>
>