Sam,

This may be of interest; as far as I can see, it suggests that a Spark
'task' is always executed as a single thread in the JVM.

http://0x0fff.com/spark-architecture/
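
If it helps, here is a minimal sketch of what that implies in practice
(the app name, pool size, and numbers below are just placeholders, not
anything from that article): Spark drives each task from a single
thread, but user code inside a task can still spawn its own threads,
e.g. within mapPartitions:

    import java.util.concurrent.Executors
    import scala.concurrent.duration._
    import scala.concurrent.{Await, ExecutionContext, Future}
    import org.apache.spark.{SparkConf, SparkContext}

    object TaskThreadSketch {
      def main(args: Array[String]): Unit = {
        // local master just so the sketch runs standalone
        val conf = new SparkConf().setAppName("task-thread-sketch").setMaster("local[2]")
        val sc = new SparkContext(conf)
        val doubled = sc.parallelize(1 to 100, numSlices = 4).mapPartitions { iter =>
          // Spark runs this closure in a single task thread; the pool below
          // is user-level concurrency layered on top, not managed by Spark.
          val pool = Executors.newFixedThreadPool(4) // placeholder pool size
          implicit val ec: ExecutionContext = ExecutionContext.fromExecutorService(pool)
          val results = iter.map(i => Future(i * 2)).toList.map(Await.result(_, 30.seconds))
          pool.shutdown()
          results.iterator
        }
        println(doubled.count())
        sc.stop()
      }
    }

So as far as I can tell, Spark itself doesn't expose a threads-per-core
setting; beyond the one task thread, any extra threads are in your hands.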

Thanks,

Jem



On Wed, Aug 26, 2015 at 10:06 AM Samya MAITI <samya.ma...@amadeus.com>
wrote:

> Thanks Jem, I do understand your suggestion. Actually, --executor-cores
> alone doesn't control the number of concurrent tasks; that is also
> governed by *spark.task.cpus* (the number of cores dedicated to each
> task's execution).
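>
> For example (if I read the docs right): with --executor-cores 4 and
> spark.task.cpus set to 2, each executor would run at most 4 / 2 = 2
> tasks concurrently.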
>
>
>
> Reframing my question: *How many threads can be spawned per executor
> core? Is it under user control?*
>
>
>
> Regards,
>
> Sam
>
>
>
> *From:* Jem Tucker [mailto:jem.tuc...@gmail.com]
> *Sent:* Wednesday, August 26, 2015 2:26 PM
> *To:* Samya MAITI <samya.ma...@amadeus.com>; user@spark.apache.org
> *Subject:* Re: Relation between threads and executor core
>
>
>
> Hi Samya,
>
>
>
> When submitting an application with spark-submit, the cores per executor
> can be set with --executor-cores, meaning that many tasks can run
> concurrently in each executor. The page below has more details on
> submitting applications:
>
>
>
> https://spark.apache.org/docs/latest/submitting-applications.html
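>
> For example, a submission along these lines (the class, master, and jar
> names are just placeholders) would give each executor four task slots:
>
>     ./bin/spark-submit \
>       --class com.example.MyApp \
>       --master yarn-cluster \
>       --executor-cores 4 \
>       path/to/my-app.jar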
>
>
>
> Thanks,
>
>
>
> Jem
>
>
>
> On Wed, Aug 26, 2015 at 9:47 AM Samya <samya.ma...@amadeus.com> wrote:
>
> Hi All,
>
> A few basic queries:
> 1. Is there a way we can control the number of threads per executor core?
> 2. Does the “executor-cores” parameter also have a say in deciding how
> many threads are run?
>
> Regards,
> Sam
>
>
>
>
>
