Re: Limit Spark Shuffle Disk Usage

2015-06-17 Thread Al M
Thanks Himanshu and RahulKumar! The Databricks forum post was extremely useful. It is great to see an article that clearly details how and when shuffles are cleaned up.
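For anyone who finds this thread later, here is a minimal sketch of the behaviour that article describes, assuming a Spark 1.x driver run against a local[2] master; the object name and the toy job are illustrative, not from the original post. Shuffle files are written under spark.local.dir and are removed by the ContextCleaner once the driver no longer references the RDD that produced them (and whatever remains is deleted when the SparkContext stops).

import org.apache.spark.{SparkConf, SparkContext}

object ShuffleCleanupSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("shuffle-cleanup-sketch").setMaster("local[2]")
    val sc = new SparkContext(conf)

    // A wide transformation writes shuffle files under spark.local.dir.
    var pairs  = sc.parallelize(1 to 1000000).map(i => (i % 100, 1))
    var counts = pairs.reduceByKey(_ + _)
    counts.count()                // materialises the shuffle on disk

    // Once the driver drops its references, the ContextCleaner can delete
    // the shuffle files; a GC on the driver is what triggers it.
    pairs = null
    counts = null
    System.gc()                   // best-effort nudge, cleanup is not immediate

    sc.stop()                     // stopping the context removes whatever is left
  }
}

Watching the spark.local.dir directory while a job like this runs is an easy way to confirm when the shuffle data actually disappears.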

Re: Limit Spark Shuffle Disk Usage

2015-06-16 Thread Himanshu Mehra
Try setting 'spark.shuffle.memoryFraction' to 0.4 (the default is 0.2); this should make a significant difference in the shuffle's disk usage. Thank you - Himanshu Mehra
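For reference, a minimal sketch of how that setting is applied in the driver, assuming Spark 1.x (where spark.shuffle.memoryFraction is still in effect); the application name is just a placeholder:

import org.apache.spark.{SparkConf, SparkContext}

object ShuffleMemoryFractionExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("shuffle-memory-fraction-example")   // placeholder name
      .set("spark.shuffle.memoryFraction", "0.4")      // default is 0.2
    val sc = new SparkContext(conf)
    // ... job code ...
    sc.stop()
  }
}

The same value can be passed on the command line with spark-submit --conf spark.shuffle.memoryFraction=0.4. Keep in mind that shuffle memory and storage memory (spark.storage.memoryFraction) are carved from the same heap, so giving more to shuffle leaves less headroom for cached blocks.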

Re: Limit Spark Shuffle Disk Usage

2015-06-15 Thread rahulkumar-aws
(SigmoidAnalytics), India

Re: Limit Spark Shuffle Disk Usage

2015-06-12 Thread Akhil Das
there for a good reason.

Limit Spark Shuffle Disk Usage

2015-06-11 Thread Al M