You are correct.
Each partition should be translated into one task, which should run on one
executor. But one executor can process more than one task. I may be wrong,
and would be grateful if someone could correct me.
Regards,
Gourav
On Wed, Apr 4, 2018 at 8:13 PM, Thodoris Zois wrote:
> Hello list!
>
> I am trying to familiarize myself with Apache Spark. I would like to ask
> something about partitioning and executors.
>
> Can I have e.g. 500 partitions but launch only one executor that will run
> operations on only one partition of the 500? And then I would like my job
> to die. Is there any