We were using YARN. Thanks.

On Sun, Jul 10, 2022 at 9:02 PM Tufan Rakshit <tufan...@gmail.com> wrote:

> It mainly depends on your cluster manager: YARN or Kubernetes?
> Best
> Tufan
>
> On Sun, 10 Jul 2022 at 14:38, Sean Owen <sro...@gmail.com> wrote:
>
>> Jobs consist of tasks, each of which consumes a core (this can be set to
>> more than 1, but that's a different story). If there are more tasks ready
>> to execute than available cores, some tasks simply wait.
>>
>> On Sun, Jul 10, 2022 at 3:31 AM Yong Walt <yongw...@gmail.com> wrote:
>>
>>> Given that my Spark cluster has 128 cores in total, what will happen if
>>> the jobs I submit to the cluster (each job assigned only one core)
>>> number more than 128?
>>>
>>> Thank you.
>>>
>>
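
The queuing behaviour Sean describes can be illustrated with a toy model (plain Python, no Spark required). The `Cluster` class below is a hypothetical sketch of a fixed-size core pool, not Spark's actual scheduler code; the numbers mirror the 128-core question above.

```python
from collections import deque

class Cluster:
    """Toy model of a fixed-size core pool, like a 128-core cluster.

    Hypothetical illustration only -- not Spark's real scheduler.
    """
    def __init__(self, total_cores):
        self.free_cores = total_cores
        self.running = []        # tasks currently holding a core
        self.waiting = deque()   # tasks that could not get a core yet

    def submit(self, task):
        # Each task consumes one core (Spark's spark.task.cpus defaults to 1).
        if self.free_cores >= 1:
            self.free_cores -= 1
            self.running.append(task)
        else:
            self.waiting.append(task)  # no core free: the task simply waits

    def finish_one(self):
        # A running task completes: its core is released, and the next
        # waiting task (if any) is scheduled onto the freed core.
        self.running.pop(0)
        self.free_cores += 1
        if self.waiting:
            self.submit(self.waiting.popleft())

cluster = Cluster(total_cores=128)
for i in range(200):              # 200 single-core tasks > 128 cores
    cluster.submit(f"task-{i}")

print(len(cluster.running))       # 128 tasks run immediately
print(len(cluster.waiting))       # 72 tasks wait in the queue

cluster.finish_one()              # one core frees up...
print(len(cluster.waiting))       # ...so one waiting task starts: 71 left
```

Nothing fails when the cluster is oversubscribed; excess tasks just sit in the pending queue until cores free up, exactly as with real Spark jobs on YARN or Kubernetes.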
