But is it possible to make it resizable? When we don't have many RDDs to
cache, we can give some memory to others.
2014-09-04 13:45 GMT+08:00 Patrick Wendell pwend...@gmail.com:
Changing this is not supported; it is immutable, similar to other Spark
configuration settings.
Thanks, Raymond. I duplicated the question; please see the reply here. [?]
2014-09-04 14:27 GMT+08:00 牛兆捷 nzjem...@gmail.com:
But is it possible to make it resizable? When we don't have many RDDs to
cache, we can give some memory to others.
[...] spark.shuffle.memoryFraction, for which you also set the upper limit.
Best Regards,
Raymond Liu
From: 牛兆捷 [mailto:nzjem...@gmail.com]
Sent: Thursday, September 04, 2014 2:27 PM
To: Patrick Wendell
Cc: user@spark.apache.org; d...@spark.apache.org
Subject: Re: memory size for caching RDD
But is it possible to make it resizable? When we don't have many RDDs to cache, we can give some memory to others.
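Both fractions Raymond mentions are ordinary configuration keys fixed at startup. A minimal spark-shell-style sketch setting them side by side (Spark 1.x; 0.6 and 0.2 were the defaults at the time and are used here only as illustrative values):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.storage.memoryFraction", "0.6")  // upper limit for cached RDD blocks
      .set("spark.shuffle.memoryFraction", "0.2")  // upper limit for shuffle aggregation buffers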
From: 牛兆捷 [mailto:nzjem...@gmail.com]
Sent: Thursday, September 04, 2014 2:57 PM
To: Liu, Raymond
Cc: Patrick Wendell; user@spark.apache.org; d...@spark.apache.org
Subject: Re: memory size for caching RDD
Oh, I see. I want to implement something like this: sometimes I need to
release some memory for other usage even when it is occupied by some RDDs
(which can be recomputed with the help of lineage when they are needed).
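For what it's worth, Spark does support giving that memory back by hand: RDD.unpersist() drops the cached blocks, and later actions recompute the partitions from lineage. A minimal, self-contained sketch (the application name and data here are only illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    object UnpersistSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("unpersist-sketch"))

        // Cache an RDD and materialize it with a first action.
        val cached = sc.parallelize(1 to 1000000).map(_ * 2).cache()
        println(cached.count())

        // Hand the storage memory back even though the RDD is still referenced;
        // blocking = true waits until the blocks are actually removed.
        cached.unpersist(blocking = true)

        // Still usable afterwards: Spark recomputes the partitions from lineage.
        println(cached.count())

        sc.stop()
      }
    }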
Changing this is not supported; it is immutable, similar to other Spark
configuration settings.
On Wed, Sep 3, 2014 at 8:13 PM, 牛兆捷 nzjem...@gmail.com wrote:
Dear all:
Spark uses memory to cache RDDs, and the memory size is specified by
spark.storage.memoryFraction.
Once the Executor starts, [...]
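For context, a minimal sketch of how that fraction is fixed when an application starts (Spark 1.x API; the 0.6 value is only illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    object CacheFractionSketch {
      def main(args: Array[String]): Unit = {
        // The fraction must be set before the SparkContext (and hence the
        // Executors) start; it cannot be changed for a running application.
        val conf = new SparkConf()
          .setAppName("cache-fraction-sketch")
          .set("spark.storage.memoryFraction", "0.6")
        val sc = new SparkContext(conf)

        // RDDs marked with cache() compete for that slice of executor memory.
        val data = sc.parallelize(1 to 1000000).cache()
        println(data.count())  // first action materializes the cache

        sc.stop()
      }
    }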