You would probably like to see http://spark.apache.org/docs/latest/configuration.html#memory-management. The other memory-related configuration parameters are explained there as well.
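As a rough back-of-the-envelope check of the original question (this is a sketch based on the defaults documented on that page for the unified memory model in Spark 1.6+: spark.memory.fraction = 0.6, spark.memory.storageFraction = 0.5, and a hard-coded ~300 MB of reserved memory), 2 GB per executor does not mean 2 GB of cache per executor:

```python
# Rough estimate of how much RDD cache fits in one executor under
# Spark's unified memory model (Spark 1.6+ defaults).
RESERVED_MB = 300            # hard-coded reserved memory
MEMORY_FRACTION = 0.6        # spark.memory.fraction default
STORAGE_FRACTION = 0.5       # spark.memory.storageFraction default

def unified_memory_mb(executor_memory_mb):
    """Memory shared by execution and storage (caching)."""
    return (executor_memory_mb - RESERVED_MB) * MEMORY_FRACTION

per_executor = unified_memory_mb(2048)   # ~1049 MB, not the full 2 GB
cluster_total = per_executor * 10        # ~10.2 GB across 10 executors

# Storage can borrow execution's share when it is idle, so caching can
# grow up to the whole unified region; STORAGE_FRACTION only marks the
# portion of it that is protected from eviction by execution.
protected_for_storage = per_executor * STORAGE_FRACTION

print(per_executor, cluster_total, protected_for_storage)
```

So with 10 executors of 2 GB each, roughly 10 GB (not 20 GB) is available for caching in the best case, and less while tasks are actively using execution memory. The rest of each JVM heap holds user code, task objects, and Spark internals.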
On Fri, Feb 5, 2016 at 10:56 AM, charles li <charles.up...@gmail.com> wrote:

> if set spark.executor.memory = 2G for each worker [ 10 in total ]
>
> does it mean I can cache 20G RDD in memory ? if so, how about the memory
> for code running in each process on each worker?
>
> thanks.
>
> --
> and is there any materials about memory management or resource management
> in spark ? I want to put spark in production, but have little knowing about
> the resource management in spark, great thanks again
>
> --
> *--------------------------------------*
> a spark lover, a quant, a developer and a good man.
>
> http://github.com/litaotao

--
Regards,
Rishitesh Mishra,
SnappyData . (http://www.snappydata.io/)

https://in.linkedin.com/in/rishiteshmishra