Hello,

Does Spark standalone support running multiple executors on one worker node?

It seems YARN has the parameter --num-executors to set the number of executors to
deploy, but I cannot find an equivalent parameter in Spark standalone.
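For reference, this is the kind of YARN submission I mean (the application class and jar name below are just placeholders, not a real app):

```shell
# On YARN, the executor count can be requested explicitly at submit time:
spark-submit \
  --master yarn \
  --num-executors 4 \
  --executor-cores 2 \
  --class com.example.MyApp \
  myapp.jar
```

I see no corresponding --num-executors option documented for --master spark://... submissions.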


Thanks,
Judy
