spark.executor.memory — is it used just for caching RDDs, or for both cached RDDs and task execution on the worker?

2016-02-04 Thread charles li
If I set spark.executor.memory = 2G for each worker [10 in total], does that mean I can cache 20G of RDDs in memory? If so, what about the memory for the code running in each process on each worker? Thanks. -- And is there any material on memory management or resource management in Spark? I want to

Re: spark.executor.memory — is it used just for caching RDDs, or for both cached RDDs and task execution on the worker?

2016-02-04 Thread Rishi Mishra
You would probably want to look at http://spark.apache.org/docs/latest/configuration.html#memory-management. The other config parameters are also explained there.

On Fri, Feb 5, 2016 at 10:56 AM, charles li wrote:
> if set spark.executor.memory = 2G for each worker [ 10 in
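To sketch the answer to the original question: since Spark 1.6, spark.executor.memory is not reserved solely for cached RDDs. Under the unified memory manager, execution memory (shuffles, joins, sorts) and storage memory (cached RDDs) share one pool, split by spark.memory.fraction and spark.memory.storageFraction, with the regions able to borrow from each other. The arithmetic below is only an illustration of that split; the defaults shown (300 MB reserved, fraction 0.6, storageFraction 0.5) have changed between Spark versions, so check the configuration page linked above for your release.

```python
# Illustrative sketch of Spark's unified memory split (post-1.6 model).
# The constants mirror commonly documented defaults and are assumptions
# for this example, not guaranteed for every Spark version.

RESERVED_MB = 300        # memory reserved for Spark internals
MEMORY_FRACTION = 0.6    # spark.memory.fraction (execution + storage share)
STORAGE_FRACTION = 0.5   # spark.memory.storageFraction (storage's initial half)

def unified_memory_split(executor_memory_mb: float):
    """Return (unified, storage, execution) pool sizes in MB."""
    usable = executor_memory_mb - RESERVED_MB
    unified = usable * MEMORY_FRACTION          # shared execution+storage pool
    storage = unified * STORAGE_FRACTION        # storage region (can grow/shrink)
    execution = unified - storage               # execution region (can borrow back)
    return unified, storage, execution

# For the 2G executor in the question: only part of the 2G is available
# for caching, and even that part is shared with task execution.
unified, storage, execution = unified_memory_split(2048)
```

So with ten 2G executors, the cluster's cache capacity is well under 20G, and it shrinks further while shuffles and joins are running.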