Suppose you have one job that does some transformation, you have X
cores in your cluster and you are willing to give all of them to that job,
and there are no shuffles (to keep it simple).

Set the number of partitions of your input data to 2X or 3X; that way each
core ends up running 2-3 tasks. A rough sketch is below.
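
For example, a minimal Scala sketch of that idea, assuming a plain RDD job
(the app name and input path are placeholders, and defaultParallelism is
used as a stand-in for the X cores given to the job):

    import org.apache.spark.{SparkConf, SparkContext}

    object PartitionTuningSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("PartitionTuningSketch")
        val sc = new SparkContext(conf)

        // Total cores available to this job; defaultParallelism typically
        // reflects the total cores across all executors.
        val x = sc.defaultParallelism

        // Ask for roughly 2-3 partitions per core so each core runs 2-3
        // tasks in this stage (the input path is just a placeholder).
        val numPartitions = 3 * x
        val lines = sc.textFile("hdfs:///path/to/input",
                                minPartitions = numPartitions)

        // A narrow transformation, no shuffle involved.
        val lengths = lines.map(_.length)

        println(s"cores = $x, partitions = ${lengths.partitions.length}")

        sc.stop()
      }
    }

Note that minPartitions is only a lower bound; the actual count also depends
on how the input splits. If the data already arrives with too few partitions
you could call repartition(numPartitions) instead, but repartition itself
introduces a shuffle.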

On 3 September 2015 at 15:56, Hans van den Bogert <hansbog...@gmail.com>
wrote:

> The tuning documentation tells us to have 2-3 tasks per CPU core
>
> > In general, we recommend 2-3 tasks per CPU core in your cluster.
>
> I’m wondering how you’d actually accomplish this.
> Setting spark.task.cpus to a fraction like 0.5 or 0.3 does not work.
>
> Perhaps I’m misunderstanding; any advice is welcome.
>
> Regards,
>
> Hans
