>> large corrupt shuffle blocks
>> https://issues.apache.org/jira/browse/SPARK-26089
>>
>> So until 3.0 the only way I can think of is to reduce the shuffle block
>> size or split your job into many smaller jobs
>>
>> On Thu, Aug 15, 2019 at 4:47 PM Mikhail Pryakhin wrote:
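
[Editor's note: the advice above to shrink shuffle blocks usually comes down to raising shuffle parallelism so that each block stays small. A minimal sketch of the relevant `spark-defaults.conf` settings; the partition counts are illustrative values, not recommendations, and should be sized to your data volume:]

```
# Raise shuffle parallelism so individual shuffle blocks stay small.
# 2000 is illustrative; aim for (shuffle data size / partitions) well
# under the problematic block sizes.
spark.sql.shuffle.partitions   2000   # DataFrame/SQL shuffles (default 200)
spark.default.parallelism      2000   # RDD shuffles (reduceByKey, join, ...)

# Shuffle compression is on by default; lz4 is the default codec in 2.2.0.
spark.shuffle.compress         true
spark.io.compression.codec     lz4
```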
>>
Hello, Spark community!
I've been struggling with a job that consistently fails because it is unable
to decompress some previously compressed blocks while shuffling data.
I use Spark 2.2.0 with all configuration settings left at their defaults (no
specific compression codec is specified). I've