My understanding is that if you run multiple applications on a worker node, then each 
application will have its own ExecutorBackend process and its own executor.
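For context, a minimal sketch of the classic way to get multiple executors on one node in standalone mode: since each worker runs one executor per application, you start multiple worker daemons per node via `spark-env.sh`. The variable names below are Spark's standard standalone settings; the values are illustrative assumptions, not a definitive recipe.

```shell
# conf/spark-env.sh -- illustrative sketch for Spark standalone mode.
# Each standalone worker runs at most one executor per application, so
# launching several worker daemons on a node yields several executors.
export SPARK_WORKER_INSTANCES=2   # start two worker daemons on this node
export SPARK_WORKER_CORES=4       # cores each worker daemon may hand out
export SPARK_WORKER_MEMORY=8g     # memory each worker daemon may hand out
```

With this in place, an application submitted to the standalone master can be scheduled onto both workers on the node, giving it two executors there.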



bit1...@163.com
 
From: Judy Nash
Date: 2015-02-26 09:58
To: user@spark.apache.org
Subject: spark standalone with multiple executors in one work node
Hello,
 
Does spark standalone support running multiple executors in one worker node? 
 
It seems YARN has the parameter --num-executors to set the number of executors to 
deploy, but I cannot find an equivalent parameter in spark standalone. 
 
 
Thanks,
Judy