How to set different executor memory limits for different worker nodes? 

I'm using Spark 1.5.2 in standalone deployment mode, launching the cluster
with the provided scripts. Executor memory is set via 'spark.executor.memory'
in conf/spark-defaults.conf, which applies the same memory limit to executors
on all the worker nodes. I would like to be able to set a different limit for
different nodes.
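
For reference, this is the setting I have now (the 4g value is just an
example); since spark-defaults.conf is read by the driver, it applies
uniformly to every executor regardless of which worker launches it:

    # conf/spark-defaults.conf -- per-application, cluster-wide
    spark.executor.memory  4g

From what I can tell, the closest per-node knob in standalone mode is
SPARK_WORKER_MEMORY in conf/spark-env.sh, which is read per machine and so
can differ across nodes, but it only caps the total memory a worker offers
to executors rather than changing the executor size itself. A sketch,
assuming a node with 16 GB of RAM:

    # conf/spark-env.sh on one particular worker node (per-machine file)
    export SPARK_WORKER_MEMORY=12g   # total memory this worker offers to executors

Is there a way to vary the executor memory limit itself per node, or is a
per-machine worker cap like this the only option?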

(Spark 1.5.2, Ubuntu 14.04)

Thanks,



