Re: Value for SPARK_EXECUTOR_CORES

2015-05-28 Thread Mulugeta Mammo
Thanks for the valuable information. The blog states: "The cores property controls the number of concurrent tasks an executor can run. --executor-cores 5 means that each executor can run a maximum of five tasks at the same time." So, I guess the max number of executor-cores I can assign is the ...
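For illustration, a minimal Scala sketch of what the quoted blog statement describes, i.e. pinning five concurrent task slots per executor via spark.executor.cores (equivalent to --executor-cores 5 on spark-submit); the app name is a placeholder, not something from this thread:

    import org.apache.spark.{SparkConf, SparkContext}

    // Each executor may run at most five tasks concurrently.
    val conf = new SparkConf()
      .setAppName("executor-cores-example")   // placeholder app name
      .set("spark.executor.cores", "5")       // concurrent task slots per executor
    val sc = new SparkContext(conf)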

RE: Value for SPARK_EXECUTOR_CORES

2015-05-28 Thread Evo Eftimov
> Thanks for the valuable information. The blog states: "The cores property controls the number of concurrent tasks an executor can run. --executor-cores 5 means that each executor can run ...

Re: Value for SPARK_EXECUTOR_CORES

2015-05-28 Thread Ruslan Dautkhanov
It's not only about cores. Keep in mind that spark.executor.cores also affects the memory available to each task. From http://blog.cloudera.com/blog/2015/03/how-to-tune-your-apache-spark-jobs-part-2/ : "The memory available to each task is (spark.executor.memory * spark.shuffle.memoryFraction ...
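A minimal sketch of that calculation in Scala; the division by spark.executor.cores, the safety fraction, the default fraction values (0.2 and 0.8), and the 4 GB executor are taken from the linked Cloudera post and Spark 1.x defaults rather than from the truncated preview above:

    // Per-task shuffle memory, per the Cloudera tuning post (part 2):
    // (spark.executor.memory * spark.shuffle.memoryFraction * spark.shuffle.safetyFraction)
    //   / spark.executor.cores
    val executorMemoryMb      = 4096   // example spark.executor.memory
    val shuffleMemoryFraction = 0.2    // spark.shuffle.memoryFraction default (Spark 1.x)
    val shuffleSafetyFraction = 0.8    // spark.shuffle.safetyFraction default (Spark 1.x)
    val executorCores         = 5      // spark.executor.cores

    val perTaskShuffleMemoryMb =
      executorMemoryMb * shuffleMemoryFraction * shuffleSafetyFraction / executorCores
    // With these example numbers: 4096 * 0.2 * 0.8 / 5 = ~131 MB per task

So raising executor-cores without raising executor memory shrinks the shuffle memory each concurrent task gets.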

Value for SPARK_EXECUTOR_CORES

2015-05-27 Thread Mulugeta Mammo
My executor has the following spec (lscpu): CPU(s): 16, Core(s) per socket: 4, Socket(s): 2, Thread(s) per core: 2. The logical CPU count is 4 * 2 * 2 = 16. My question is: what value is Spark expecting in SPARK_EXECUTOR_CORES? The logical CPU count (16) or the physical core count (4 * 2 = 8)? Thanks
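As a quick illustration of the arithmetic from that lscpu output (the values come from the message above; the variable names are only for exposition, nothing Spark reads):

    // From the lscpu output above:
    val sockets        = 2
    val coresPerSocket = 4
    val threadsPerCore = 2

    val physicalCores = sockets * coresPerSocket                   // 2 * 4 = 8
    val logicalCpus   = sockets * coresPerSocket * threadsPerCore  // 2 * 4 * 2 = 16
    // The question is which of these two numbers SPARK_EXECUTOR_CORES
    // (i.e. spark.executor.cores, the concurrent task slots per executor) should use.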