Hello,
I have a question about Spark dataflow. If I understand correctly, all
received data is sent from the executors to the driver of the application
prior to task creation.
Then the tasks embedding the data travel from the driver to the executors
in order to be processed.
Since executors cannot exchange data among themselves, during a shuffle the
data also transits through the driver.
Is that correct?
Thomas
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscr...@spark.apache.org