Changing this is not supported; like other Spark configuration settings, it is immutable once the application starts.
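Since the value is fixed for the lifetime of the application, it has to be set before launch, e.g. in spark-defaults.conf or via --conf on spark-submit. A minimal sketch (the 0.4 value here is just an illustrative choice, not a recommendation):

```properties
# conf/spark-defaults.conf
# Fraction of executor heap reserved for the RDD block cache.
# Read once at application startup; cannot be resized afterwards.
spark.storage.memoryFraction  0.4
```

Equivalently, per-application: spark-submit --conf spark.storage.memoryFraction=0.4 ...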
On Wed, Sep 3, 2014 at 8:13 PM, 牛兆捷 <nzjem...@gmail.com> wrote:

> Dear all:
>
> Spark uses memory to cache RDDs, and the memory size is specified by
> "spark.storage.memoryFraction".
>
> Once the Executor starts, does Spark support adjusting/resizing the
> memory size of this part dynamically?
>
> Thanks.
>
> --
> *Regards,*
> *Zhaojie*

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
For additional commands, e-mail: dev-h...@spark.apache.org