Each partition is translated into one task, and each task runs on a single
executor. One executor, however, can process more than one task. I may be
wrong, and I would be grateful if someone could correct me.
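
To illustrate (a rough sketch, not something from this thread, and the app
name, dataset, and flags are only assumptions for the example): you could
submit with a single executor and a single core, and then ask Spark to run a
job over just one of the 500 partitions, e.g. with runJob:

  // Submit with one executor and one core (YARN-style flags), e.g.:
  //   spark-submit --num-executors 1 --executor-cores 1 --class SinglePartitionDemo app.jar
  import org.apache.spark.sql.SparkSession

  object SinglePartitionDemo {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder().appName("single-partition-demo").getOrCreate()
      val sc = spark.sparkContext

      // Dummy data split into 500 partitions, just for illustration
      val rdd = sc.parallelize(1 to 1000000, numSlices = 500)

      // Run a job over partition 0 only; the other 499 partitions are never scheduled
      val result = sc.runJob(rdd, (it: Iterator[Int]) => it.size, Seq(0))
      println(result.mkString(","))

      spark.stop()  // stop the application once the single-partition job is done
    }
  }

Without modifying the code, limiting the executors is easy via the submit
flags above, but picking out a single partition does need something like the
runJob call (or a mapPartitionsWithIndex filter) in the application itself.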

Regards,
Gourav

On Wed, Apr 4, 2018 at 8:13 PM, Thodoris Zois <z...@ics.forth.gr> wrote:

>
> Hello list!
>
> I am trying to familiarize with Apache Spark. I  would like to ask
> something about partitioning and executors.
>
> Can I have, e.g., 500 partitions but launch only one executor that will run
> operations on only 1 of the 500 partitions? And then I would like my job to
> die.
>
> Is there any easy way? Or do I have to modify the code to achieve that?
>
> Thank you,
>  Thodoris
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
