You can request the number of cores and the amount of memory for each executor.
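As a sketch, the resource caps can be passed on the command line when launching spark-shell against a standalone master (the master URL `spark://master:7077` and the values are placeholders, not from this thread):

```shell
# Cap per-executor memory and the total cores the application may
# take across the whole standalone cluster. The scheduler spreads
# the core budget over the workers, which indirectly controls how
# many workers end up hosting an executor for this app.
./bin/spark-shell \
  --master spark://master:7077 \
  --executor-memory 2g \
  --total-executor-cores 4

# Equivalent settings in conf/spark-defaults.conf:
# spark.executor.memory  2g
# spark.cores.max        4
```

Without `spark.cores.max`, a standalone application grabs all available cores by default, which matches the "greedy" behaviour asked about below.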
On 27 May 2015 18:25, canan chen ccn...@gmail.com wrote:
Thanks Arush.
My scenario is that in standalone mode, if I have one worker, one executor is launched when I start spark-shell. But if I have 2 workers, 2 executors are launched, so I am wondering about the mechanism of executor allocation.
Is it possible to specify how many executors to launch?
I believe you would be restricted by the number of cores you have in your cluster. A worker running without any cores is useless.
On Tue, May 26, 2015 at 3:04 PM, canan chen ccn...@gmail.com wrote:
In Spark standalone mode, there is one executor per worker. I am wondering how many executors I can acquire when I submit an app. Is it greedy mode (as many as I can acquire)?