I have a Spark job that has multiple stages. For now I start it with 100
executors, each with 12G of memory (the max is 16G). I am using Spark 1.3
over YARN 2.4.x.
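A minimal sketch of that configuration, assuming the executor count and
memory are set through SparkConf (the app name is a placeholder;
spark.executor.instances is honored in YARN mode):

import org.apache.spark.{SparkConf, SparkContext}

// Sketch of the setup above: 100 executors with 12G each, on YARN.
val conf = new SparkConf()
  .setAppName("MultiStageJob")               // placeholder app name
  .set("spark.executor.instances", "100")    // takes effect on YARN
  .set("spark.executor.memory", "12g")
val sc = new SparkContext(conf)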
For now I start the Spark job with a very limited input (1 file of size
2G); overall there are 200 files. My first run is yet to complete, as it's ...
Thanks Sandy, appreciate it.
On Thu, Apr 9, 2015 at 10:32 PM, Sandy Ryza wrote:
> Hi Deepak,
>
> I'm going to shamelessly plug my blog post on tuning Spark:
>
> http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/
>
> It talks about tuning executor size as well as how the number of tasks
> for a stage is calculated.
>
> -Sandy
Hi Deepak,
I'm going to shamelessly plug my blog post on tuning Spark:
http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/
It talks about tuning executor size as well as how the number of tasks for
a stage is calculated.
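As a rough illustration (the HDFS path and the 400 are made-up values),
the task count for a stage follows the partition count of its RDD:

// assuming sc is an existing SparkContext
val lines = sc.textFile("hdfs:///tmp/input")    // input stage: one task per HDFS split
println(s"input stage tasks: ${lines.partitions.length}")

val counts = lines
  .flatMap(_.split("\\s+"))
  .map(word => (word, 1))
  .reduceByKey(_ + _, 400)                      // shuffle stage: runs 400 tasks
println(s"shuffle stage tasks: ${counts.partitions.length}")

So, for example, a 2G input file with 128M HDFS blocks would split into
roughly 16 tasks for the first stage.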
-Sandy
On Thu, Apr 9, 2015 at 9:21 AM, ÐΞ€ρ@Ҝ wrote: