Re: how to debug this kind of error, e.g. lost executor?

2015-02-11 Thread Praveen Garg
Try increasing the value of spark.yarn.executor.memoryOverhead. Its default value is 384 MB in Spark 1.1. This error generally occurs when your process's memory usage exceeds its maximum allocation. Use the following property to increase the memory overhead.
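A minimal sketch of raising this property from application code, assuming a YARN deployment on Spark 1.x; the value 1024 and the app name are illustrative assumptions, not values from the thread (the same key can equally be passed via --conf on spark-submit):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Raise the per-executor memory overhead reserved by YARN from the
// Spark 1.1 default of 384 MB to 1024 MB (value chosen here only as
// an example; tune it to your workload).
val conf = new SparkConf()
  .setAppName("MemoryOverheadExample") // hypothetical app name
  .set("spark.yarn.executor.memoryOverhead", "1024")

val sc = new SparkContext(conf)
```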

Re: Shuffle read/write issue in spark 1.2

2015-02-06 Thread Praveen Garg
To: Praveen Garg <praveen.g...@guavus.com> Cc: user@spark.apache.org Subject: Re: Shuffle read/write issue in spark 1.2 Even I observed the same issue. On Fri, Feb 6, 2015 at 12:19 AM, Praveen

Re: Shuffle read/write issue in spark 1.2

2015-02-06 Thread Praveen Garg
...@gmail.com Date: Saturday, 7 February 2015 1:22 am To: Praveen Garg <praveen.g...@guavus.com> Cc: Raghavendra Pandey <raghavendra.pan...@gmail.com>, user@spark.apache.org