Dear all:

Spark uses memory to cache RDDs, and the size of that cache is controlled by
the "spark.storage.memoryFraction" setting.

Once the Executor starts, does Spark support adjusting/resizing this portion
of memory dynamically?

Thanks.

-- 
*Regards,*
*Zhaojie*
