Re: how to run local[k] threads on a single core

2016-08-05 Thread sujeet jog
Thanks,

Since I'm running in local mode, I plan to pin the JVM to a CPU with
taskset -cp , and hopefully with that all the tasks will run on the
specified CPU cores.
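A minimal sketch of that plan (assuming Linux, util-linux's taskset, and the
JDK's jps tool; the core number and the process-name match are examples, not
from this thread):

```shell
# Find the PID of the Spark driver JVM (the name match is an assumption;
# check `jps -l` output on your machine).
PID=$(jps -l | awk '/SparkSubmit/ {print $1}')

# Pin the process and all of its existing threads to CPU core 0.
# Without -a, taskset changes only the main thread's affinity.
taskset -a -cp 0 "$PID"

# Verify the new affinity mask.
taskset -cp "$PID"
```

Threads started after this inherit the mask, so pinning early is safer than
pinning an already busy JVM.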

Thanks,
Sujeet

On Thu, Aug 4, 2016 at 8:11 PM, Daniel Darabos <
daniel.dara...@lynxanalytics.com> wrote:

> You could run the application in a Docker container constrained to one CPU
> with --cpuset-cpus (https://docs.docker.com/engine/reference/run/#/cpuset-
> constraint).
>
> On Thu, Aug 4, 2016 at 8:51 AM, Sun Rui  wrote:
>
>> I don't think it's possible, as Spark does not support thread-to-CPU
>> affinity.
>> > On Aug 4, 2016, at 14:27, sujeet jog  wrote:
>> >
>> > Is there a way we can run multiple tasks concurrently on a single core
>> in local mode?
>> >
>> > For example: I have 5 partitions ~ 5 tasks and only a single core; I
>> want these tasks to run concurrently and specify that they run on a single
>> core.
>> >
>> > The machine itself has, say, 4 cores, but I want to utilize only 1 core
>> out of them.
>> >
>> > Is it possible?
>> >
>> > Thanks,
>> > Sujeet
>> >
>>
>>
>>
>> -
>> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>>
>>
>


Re: how to run local[k] threads on a single core

2016-08-04 Thread Daniel Darabos
You could run the application in a Docker container constrained to one CPU
with --cpuset-cpus (
https://docs.docker.com/engine/reference/run/#/cpuset-constraint).
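A minimal sketch of that (the image name, application file, and core number
are placeholders, not from this thread):

```shell
# Constrain the container to CPU core 0; local[5] still runs 5
# concurrent task threads, but they all time-share that one core.
docker run --rm --cpuset-cpus="0" \
  my-spark-image \
  spark-submit --master "local[5]" my_app.py
```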

On Thu, Aug 4, 2016 at 8:51 AM, Sun Rui  wrote:

> I don't think it's possible, as Spark does not support thread-to-CPU affinity.
> > On Aug 4, 2016, at 14:27, sujeet jog  wrote:
> >
> > Is there a way we can run multiple tasks concurrently on a single core
> in local mode?
> >
> > For example: I have 5 partitions ~ 5 tasks and only a single core; I want
> these tasks to run concurrently and specify that they run on a single
> core.
> >
> > The machine itself has, say, 4 cores, but I want to utilize only 1 core
> out of them.
> >
> > Is it possible?
> >
> > Thanks,
> > Sujeet
> >
>
>
>
> -
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>


Re: how to run local[k] threads on a single core

2016-08-03 Thread Sun Rui
I don't think it's possible, as Spark does not support thread-to-CPU affinity.
> On Aug 4, 2016, at 14:27, sujeet jog  wrote:
> 
> Is there a way we can run multiple tasks concurrently on a single core in
> local mode?
> 
> For example: I have 5 partitions ~ 5 tasks and only a single core; I want
> these tasks to run concurrently and specify that they run on a single core.
> 
> The machine itself has, say, 4 cores, but I want to utilize only 1 core out
> of them.
> 
> Is it possible?
> 
> Thanks,
> Sujeet
> 



-
To unsubscribe e-mail: user-unsubscr...@spark.apache.org



how to run local[k] threads on a single core

2016-08-03 Thread sujeet jog
Is there a way we can run multiple tasks concurrently on a single core in
local mode?

For example: I have 5 partitions ~ 5 tasks and only a single core; I want
these tasks to run concurrently and specify that they run on a single core.

The machine itself has, say, 4 cores, but I want to utilize only 1 core out
of them.

Is it possible?

Thanks,
Sujeet
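
Note that local[k] only sets the number of task threads, not which CPU they
run on. One way to approximate the setup above on Linux is to launch the
driver under taskset (a sketch; the application file name is a placeholder):

```shell
# Restrict the driver JVM's CPU affinity to core 0 before it starts;
# all 5 task threads inherit the mask and time-share that single core.
taskset -c 0 spark-submit --master "local[5]" my_app.py
```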