The article below gives a good overview.

http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/

Play around with the two configurations (a large number of executors with few 
cores each, and a small number of executors with many cores each). The 
calculated values have to be conservative, or the Spark jobs will become 
unstable.
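As a rough sketch of the two extremes on a 64 GB / 32-core standalone worker (the master URL, memory, and core counts below are illustrative assumptions, not recommendations):

```shell
# Illustrative only: leave a few GB and a core or two for the OS and
# the Spark daemons. spark://master:7077 is a placeholder master URL.

# Option A: many small executors (4 cores, ~7 GB heap each).
# Smaller heaps tend to mean shorter GC pauses.
spark-submit --master spark://master:7077 \
  --executor-cores 4 \
  --executor-memory 7G \
  --total-executor-cores 28 \
  your-app.jar

# Option B: one large executor per machine (close to the default
# standalone behavior). Fewer JVMs, more sharing of cached/broadcast
# data, but a large heap can suffer long GC pauses.
spark-submit --master spark://master:7077 \
  --executor-cores 28 \
  --executor-memory 56G \
  your-app.jar
```

Benchmark both shapes on a representative job before settling on one.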

Thx
tri

From: Veljko Skarich [mailto:veljko.skar...@gmail.com]
Sent: Tuesday, December 15, 2015 3:08 PM
To: user@spark.apache.org
Subject: ideal number of executors per machine

Hi,

I'm looking for suggestions on the ideal number of executors per machine. I run 
my jobs on 64G 32 core machines, and at the moment I have one executor running 
per machine, on the spark standalone cluster.

I could not find many guidelines for figuring out the ideal number of 
executors; the official Spark documentation merely recommends not having more 
than 64G per executor, to avoid GC issues. Does anyone have any advice on this?

thank you.
