How many cores do you have in your boxes?
It looks like you are assigning 32 cores *per* executor - is that what you want?
Are there other applications running on the cluster? You might want to check the
YARN UI to see how many containers are actually getting allocated to your application.
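
You can also check from inside the shell which executors actually registered and
how much storage memory each one reports. A quick sketch (this uses
SparkContext.getExecutorMemoryStatus from the 1.x API; the GB conversion is just
for readability):

  // Run inside spark-shell: list registered executors and their storage memory
  sc.getExecutorMemoryStatus.foreach { case (executor, (maxMem, freeMem)) =>
    val gb = 1024.0 * 1024 * 1024
    println(f"$executor%-25s max: ${maxMem / gb}%.1f GB  free: ${freeMem / gb}%.1f GB")
  }

If fewer executors show up than you asked for, or each one reports less memory
than expected, that usually points back to what YARN was actually able to allocate.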


On Sep 19, 2014, at 1:37 PM, Soumya Simanta <soumya.sima...@gmail.com> wrote:

> I'm launching a Spark shell with the following parameters
> 
> ./spark-shell --master yarn-client --executor-memory 32g --driver-memory 4g 
> --executor-cores 32 --num-executors 8 
> 
> but when I look at the Spark UI it shows only 209.3 GB total memory. 
> 
> 
> Executors (10)
> Memory: 55.9 GB Used (209.3 GB Total)
> This is a 10-node YARN cluster where each node has 48 GB of memory. 
> 
> Any idea what I'm missing here? 
> 
> Thanks
> -Soumya
> 
> 
> 
