about cpu cores

2022-07-10 Thread Yong Walt
Given that my Spark cluster has 128 cores in total: if I submit more than
128 jobs (each job assigned only one core) to the cluster, what will
happen?

Thank you.


Re: about cpu cores

2022-07-10 Thread Sean Owen
Jobs consist of tasks, each of which consumes a core (can be set to >1 too,
but that's a different story). If there are more tasks ready to execute
than available cores, some tasks simply wait.
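That waiting behavior can be illustrated outside of Spark with a plain-Python sketch (this is not Spark code, just an analogy): a thread pool with fewer workers than submitted tasks queues the surplus, exactly as a scheduler with fewer cores than ready tasks would. The names CORES and TASKS are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor
import threading
import time

CORES = 4   # pretend the cluster has 4 cores
TASKS = 10  # more tasks than cores

running = 0  # tasks executing right now
peak = 0     # highest concurrency observed
lock = threading.Lock()

def task(i):
    global running, peak
    with lock:
        running += 1
        peak = max(peak, running)
    time.sleep(0.05)  # simulate work
    with lock:
        running -= 1
    return i

# max_workers plays the role of available cores:
# only CORES tasks run at once; the rest wait in the queue.
with ThreadPoolExecutor(max_workers=CORES) as pool:
    results = list(pool.map(task, range(TASKS)))

print(peak)           # never exceeds CORES
print(len(results))   # all tasks eventually complete
```

All 10 tasks finish; they are just serialized onto the 4 available slots, which is what happens to excess Spark tasks as well.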


Re: about cpu cores

2022-07-10 Thread Tufan Rakshit
It mainly depends on your cluster manager: YARN or Kubernetes?
Best
Tufan


Re: about cpu cores

2022-07-11 Thread Yong Walt
We are using YARN. Thanks.


Re: about cpu cores

2022-07-11 Thread Tufan Rakshit
So, on average, for every 4 cores you get back about 3.6 cores in YARN,
but you can use only 3 of them; in Kubernetes you get back about 3.6 and
can also use all 3.6.

Best
Tufan
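In practice the per-executor core count is what you tune to fit whatever the cluster manager actually hands back. A hedged spark-submit sketch for YARN follows; the resource numbers and the script name my_job.py are illustrative placeholders, not recommendations:

```shell
# Request 16 executors with 3 cores each on YARN:
# up to 16 * 3 = 48 tasks can run concurrently,
# since each task needs spark.task.cpus (here 1) cores.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 16 \
  --executor-cores 3 \
  --conf spark.task.cpus=1 \
  my_job.py
```

If more than 48 tasks are ready at once, the extras wait, per Sean's answer above.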


Re: about cpu cores

2022-07-11 Thread Gourav Sengupta
Hi,
Please see Sean's answer, and read about parallelism in Spark.

Regards,
Gourav Sengupta
