[GitHub] spark issue #14658: [WIP][SPARK-5928][SPARK-6238] Remote Shuffle Blocks cannot be more than 2 GB

2017-12-26 Thread MJFND
Github user MJFND commented on the issue: https://github.com/apache/spark/pull/14658 Okay, but even if not, then increasing the number of shuffle partitions should fix it, but it's not. On Dec 26, 2017 8:51 PM, "Guoqiang Li" wrote: > Spark 2.2 has fix

[GitHub] spark issue #14658: [WIP][SPARK-5928][SPARK-6238] Remote Shuffle Blocks cannot be more than 2 GB

2017-12-26 Thread MJFND
Github user MJFND commented on the issue: https://github.com/apache/spark/pull/14658 If "Remote Shuffle Blocks cannot be more than 2 GB", then setting spark.sql.shuffle.partitions=value, where value should be chosen so that each executor handles roughly 2 GB, like for 200 GB of data, w
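As a concrete illustration of the workaround being discussed, here is a minimal sketch of raising spark.sql.shuffle.partitions so that no single shuffle partition approaches the 2 GB block limit. This is not from the thread itself; the app name, the ~512 MB target partition size, and the resulting count of 400 partitions are illustrative assumptions, not values the commenters gave.

import org.apache.spark.sql.SparkSession

object ShufflePartitionSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("shuffle-partition-sketch") // hypothetical app name
      .getOrCreate()

    // For roughly 200 GB of shuffled data, targeting ~512 MB per
    // partition gives 200 * 1024 / 512 = 400 partitions, keeping
    // each shuffle block well below the 2 GB limit tracked in
    // SPARK-5928. Both numbers here are assumptions for the sketch.
    spark.conf.set("spark.sql.shuffle.partitions", "400")

    // ... run the shuffle-heavy SQL / DataFrame operations here ...

    spark.stop()
  }
}

The same setting can also be passed at submit time with --conf spark.sql.shuffle.partitions=400. Note that this only bounds the average partition size; as the thread suggests, a heavily skewed key can still push one partition past 2 GB even with a high partition count.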