Re: Spark Error: Not enough space to cache partition rdd

2016-02-14 Thread ayan guha
Have you tried repartitioning to a larger number of partitions? Also, I would
suggest increasing the number of executors and giving each of them a smaller
amount of memory.
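
A minimal sketch of both suggestions, assuming a YARN deployment and that
inputRdd stands in for the RDD being cached; the partition count and
executor sizes are illustrative, not tuned:

    import org.apache.spark.storage.StorageLevel

    // 1) More, smaller partitions, so each cached block fits in memory.
    //    inputRdd is a placeholder for the RDD feeding k-means.
    val repartitioned = inputRdd.repartition(400)
    repartitioned.persist(StorageLevel.MEMORY_AND_DISK)

    // 2) Several smaller executors instead of one large one, e.g.:
    //    spark-submit --num-executors 4 --executor-cores 6 \
    //                 --executor-memory 16g ...
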
On 15 Feb 2016 06:49, "gustavolacerdas" <gustavolacer...@gmail.com> wrote:

> I have a machine with 96GB of memory and 24 cores. I'm trying to run a
> k-means algorithm on 30GB of input data. My spark-defaults.conf file is
> configured like this:
>
>   spark.driver.memory       80g
>   spark.executor.memory     70g
>   spark.network.timeout     1200s
>   spark.rdd.compress        true
>   spark.broadcast.compress  true
>
> But I always get this error: "Spark Error: Not enough space to cache
> partition rdd". I changed the k-means code to persist the RDD with
> rdd.persist(StorageLevel.MEMORY_AND_DISK), but it didn't work.


Spark Error: Not enough space to cache partition rdd

2016-02-14 Thread gustavolacerdas
I have a machine with 96GB of memory and 24 cores. I'm trying to run a
k-means algorithm on 30GB of input data. My spark-defaults.conf file is
configured like this:

  spark.driver.memory       80g
  spark.executor.memory     70g
  spark.network.timeout     1200s
  spark.rdd.compress        true
  spark.broadcast.compress  true

But I always get this error: "Spark Error: Not enough space to cache
partition rdd". I changed the k-means code to persist the RDD with
rdd.persist(StorageLevel.MEMORY_AND_DISK), but it didn't work.
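
Roughly what I changed, as a sketch (the input path and parsing are
placeholders, and a SparkContext sc is assumed to be in scope):

    import org.apache.spark.mllib.clustering.KMeans
    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.storage.StorageLevel

    // Placeholder input: one whitespace-separated feature vector per line.
    val data = sc.textFile("hdfs:///path/to/input")
    val vectors = data.map(l => Vectors.dense(l.split(' ').map(_.toDouble)))

    // Spill partitions that don't fit in memory to disk instead of failing.
    vectors.persist(StorageLevel.MEMORY_AND_DISK)

    val model = KMeans.train(vectors, k = 10, maxIterations = 20)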



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Error-Not-enough-space-to-cache-partition-rdd-tp26222.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.