Have you tried passing --executor-cores or --total-executor-cores as arguments, 
depending on the Spark version?
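For example, something along these lines (a rough sketch only; the master URL is a 
placeholder and which flag applies depends on your deploy mode):

    # Standalone mode: cap the total cores the application may use across the cluster
    ./bin/spark-shell \
      --master spark://<master-host>:7077 \
      --total-executor-cores 16

    # Or limit the cores each executor gets (standalone / YARN)
    ./bin/spark-shell --executor-cores 4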


From: kant kodali [mailto:kanth...@gmail.com]
Sent: Friday, February 17, 2017 5:03 PM
To: Alex Kozlov <ale...@gmail.com>
Cc: user @spark <user@spark.apache.org>
Subject: Re: question on SPARK_WORKER_CORES

Standalone.

On Fri, Feb 17, 2017 at 5:01 PM, Alex Kozlov 
<ale...@gmail.com> wrote:
What Spark mode are you running the program in?

On Fri, Feb 17, 2017 at 4:55 PM, kant kodali 
<kanth...@gmail.com> wrote:
when I submit a job using spark shell I get something like this


[Stage 0:========>    (36814 + 4) / 220129]



Now all I want is to increase the number of parallel tasks running from 4 to 
16, so I exported an env variable called SPARK_WORKER_CORES=16 in 
conf/spark-env.sh. I thought that should do it, but it doesn't. It still shows 
4. Any idea?
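For reference, the change above amounts to something like this (a sketch, assuming a 
standalone worker; note that spark-env.sh is only read when the worker process starts, 
so the worker has to be restarted for a new value to take effect):

    # conf/spark-env.sh on each worker machine
    SPARK_WORKER_CORES=16

    # restart the standalone workers so the setting is picked up
    ./sbin/stop-all.sh
    ./sbin/start-all.sh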



Thanks much!




--
Alex Kozlov
(408) 507-4987
(650) 887-2135 efax
ale...@gmail.com
