If I set spark.executor.memory = 2G for each worker (10 workers in total), does that mean I can cache a 20G RDD in memory? If so, how much memory is left for the code running in each process on each worker?
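
To make the setup concrete, here is a rough sketch of what I have in mind (I'm assuming a standalone cluster with one executor per worker; the master URL and input path are just placeholders, and the storage-fraction property and its default are only my understanding from the docs, so please correct me if they're wrong):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// Hypothetical setup: 10 workers, one executor per worker, 2G heap each.
val conf = new SparkConf()
  .setAppName("memory-question")
  .setMaster("spark://master:7077")          // hypothetical master URL
  .set("spark.executor.memory", "2g")
  // My understanding: only a fraction of the executor heap
  // (spark.storage.memoryFraction, default 0.6 in Spark 1.x) is used for
  // cached RDDs; the rest is left for task execution and user code.
  .set("spark.storage.memoryFraction", "0.6")

val sc = new SparkContext(conf)

// Caching uses only the storage fraction of each 2G executor heap,
// so I suspect the total cache capacity is less than 10 * 2G.
val data = sc.textFile("hdfs:///path/to/input")   // hypothetical path
data.persist(StorageLevel.MEMORY_ONLY)
println(data.count())

Is that roughly how the memory is divided between caching and running my code?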

thanks.


Also, are there any materials about memory management or resource management in Spark? I want to put Spark into production, but I know little about resource management in Spark. Many thanks again.


-- 
*--------------------------------------*
a spark lover, a quant, a developer and a good man.

http://github.com/litaotao
