I have a machine with 96GB of RAM and 24 cores. I'm trying to run a k-means
algorithm on 30GB of input data. My spark-defaults.conf file is configured
like this:

    spark.driver.memory         80g
    spark.executor.memory       70g
    spark.network.timeout       1200s
    spark.rdd.compress          true
    spark.broadcast.compress    true

But I always get this error:

    Spark Error: Not enough space to cache partition rdd

I changed the k-means code to persist the RDD with
rdd.persist(StorageLevel.MEMORY_AND_DISK), but it didn't work.
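For reference, here is roughly what the relevant part of my code looks like
after the change (the input path and the k/iteration values below are
placeholders, not my real ones):

    import org.apache.spark.mllib.clustering.KMeans
    import org.apache.spark.mllib.linalg.Vectors
    import org.apache.spark.storage.StorageLevel

    // Parse the input into feature vectors, then persist with
    // MEMORY_AND_DISK so partitions that don't fit in memory spill
    // to disk instead of failing to cache.
    val data = sc.textFile("hdfs:///path/to/input")
      .map(line => Vectors.dense(line.split(' ').map(_.toDouble)))
      .persist(StorageLevel.MEMORY_AND_DISK)

    val model = KMeans.train(data, 10, 20)  // k = 10, maxIterations = 20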


