If I set spark.executor.memory = 2G for each executor [ 10 in total ],
does that mean I can cache a 20G RDD in memory? If so, how much memory is
left for the code running in each executor process?
thanks.
--
And are there any materials about memory management or resource management
in Spark? I want to
You would probably like to see
http://spark.apache.org/docs/latest/configuration.html#memory-management.
Other config parameters are also explained there.
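As a rough sketch of the arithmetic behind those settings: under Spark's
unified memory manager (Spark 1.6+), the region shared by execution and
storage is (heap - reserved) * spark.memory.fraction. The numbers below
assume the Spark 1.6 defaults of 300 MB reserved and
spark.memory.fraction = 0.75; both are version-dependent, so check the
linked configuration page for your release.

```python
# Back-of-the-envelope estimate for Spark's unified memory management.
# Assumptions: 300 MB reserved per executor JVM and a
# spark.memory.fraction default of 0.75 (Spark 1.6 values; later
# versions changed the default fraction).

RESERVED_MB = 300          # fixed reservation per executor JVM
MEMORY_FRACTION = 0.75     # spark.memory.fraction default in Spark 1.6

def usable_memory_mb(executor_memory_mb, fraction=MEMORY_FRACTION):
    """Memory shared by execution and storage within one executor."""
    return (executor_memory_mb - RESERVED_MB) * fraction

per_executor = usable_memory_mb(2048)   # spark.executor.memory = 2G
cluster_total = 10 * per_executor       # 10 executors

print(per_executor)    # 1311.0 MB per executor, not the full 2048
print(cluster_total)   # 13110.0 MB across the cluster, well under 20G
```

So with these defaults, roughly 13G of the nominal 20G would be available
for caching plus execution combined; within that region, storage and
execution borrow from each other as described on the configuration page.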
On Fri, Feb 5, 2016 at 10:56 AM, charles li wrote: