Hi Abhi,

Do you mean that each task of a job can have a different priority, or that
jobs generated by one job can have different priorities?



On Tue, Mar 17, 2015 at 11:04 AM, Mark Hamstra <m...@clearstorydata.com>
wrote:

>
> http://apache-spark-developers-list.1001551.n3.nabble.com/Job-priority-td10076.html#a10079
>
> On Mon, Mar 16, 2015 at 10:26 PM, abhi <abhishek...@gmail.com> wrote:
>
>> If I understand correctly, the document above describes priority pools
>> that are static in nature and have to be defined before submitting the
>> job. In my scenario, each generated task can have a different priority.
>>
>> Thanks,
>> Abhi
>>
>>
>> On Mon, Mar 16, 2015 at 9:48 PM, twinkle sachdeva <
>> twinkle.sachd...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> Maybe this is what you are looking for:
>>> http://spark.apache.org/docs/1.2.0/job-scheduling.html#fair-scheduler-pools
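>>>
>>> For what it's worth, here is a minimal sketch of how those pools are
>>> typically wired up (the pool names and weights below are just
>>> illustrative): pools are declared up front in conf/fairscheduler.xml,
>>> and jobs are routed to a pool at runtime by setting a thread-local
>>> property on the SparkContext:
>>>
>>> <!-- conf/fairscheduler.xml: declare the pools and their weights -->
>>> <allocations>
>>>   <pool name="highPriority">
>>>     <schedulingMode>FAIR</schedulingMode>
>>>     <weight>4</weight>
>>>     <minShare>2</minShare>
>>>   </pool>
>>>   <pool name="lowPriority">
>>>     <schedulingMode>FAIR</schedulingMode>
>>>     <weight>1</weight>
>>>   </pool>
>>> </allocations>
>>>
>>> // Driver code (Scala); requires spark.scheduler.mode=FAIR.
>>> // Jobs submitted from this thread go to the named pool.
>>> sc.setLocalProperty("spark.scheduler.pool", "highPriority")
>>> sc.parallelize(1 to 1000).count() // scheduled in the highPriority pool
>>> sc.setLocalProperty("spark.scheduler.pool", null) // revert to default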
>>>
>>> Thanks,
>>>
>>> On Mon, Mar 16, 2015 at 8:15 PM, abhi <abhishek...@gmail.com> wrote:
>>>
>>>> Hi
>>>> Currently, all jobs in Spark get submitted through a queue. I have a
>>>> requirement where a submitted job will generate another set of jobs,
>>>> each with some priority, which should again be submitted to the Spark
>>>> cluster based on that priority; that is, a job with a higher priority
>>>> should be executed first. Is this feasible?
>>>>
>>>> Any help is appreciated.
>>>>
>>>> Thanks,
>>>> Abhi
>>>>
>>>>
>>>
>>>
>>
>
