To: Praveen Garg <praveen.g...@guavus.com>
Cc: user@spark.apache.org
Subject: Re: Shuffle read/write issue in spark 1.2
I observed the same issue as well.
On Fri, Feb 6, 2015 at 12:19 AM, Praveen <...@gmail.com> wrote:
Date: Saturday, 7 February 2015 1:22 am
To: Praveen Garg <praveen.g...@guavus.com>
Cc: Raghavendra Pandey <raghavendra.pan...@gmail.com>, user@spark.apache.org
Try increasing the value of spark.yarn.executor.memoryOverhead. Its default
value is 384 MB in Spark 1.1. This error generally occurs when your process's
memory usage exceeds its maximum allocation. Use the following property to
increase the memory overhead.
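
As a minimal sketch (the 1024 MB value and the application name are only illustrative; tune the overhead to your workload), the property can be set in the driver's SparkConf like this, or equivalently passed with spark-submit --conf or in spark-defaults.conf:

    import org.apache.spark.{SparkConf, SparkContext}

    // Request 1024 MB of off-heap headroom per executor instead of the
    // 384 MB default (property name as used on YARN in Spark 1.1/1.2).
    val conf = new SparkConf()
      .setAppName("shuffle-heavy-job")  // hypothetical app name
      .set("spark.yarn.executor.memoryOverhead", "1024")
    val sc = new SparkContext(conf)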
From: Yifan LI