Thanks Akhil, it was a simple fix that you suggested .. I had missed it .. ☺
From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, February 25, 2015 12:48 PM
To: Somnath Pandeya
Cc: user@spark.apache.org
Subject: Re: used cores are less then total no. of core
Try adding --total-executor-cores 5, where 5 is the number of cores.
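For illustration, here is a sketch of how that flag fits into a full spark-submit invocation (the master URL, class name, jar, and input path below are placeholders, not from the thread):

```shell
# Illustrative only: adjust the master URL, class, jar, and input to your setup.
# --total-executor-cores caps how many cores the application requests
# across the whole standalone cluster.
spark-submit \
  --master spark://master-host:7077 \
  --class org.apache.spark.examples.JavaWordCount \
  --total-executor-cores 5 \
  spark-examples.jar \
  input.txt
```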
Thanks,
Vishnu
On Wed, Feb 25, 2015 at 11:52 AM, Somnath Pandeya
somnath_pand...@infosys.com wrote:
Hi All,
I am running a simple word count example on Spark (standalone cluster).
In the UI it is showing
You can set the following in the conf while creating the SparkContext (if
you are not using spark-submit):
.set("spark.cores.max", "32")
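In context, a minimal sketch of that suggestion (the app name and master URL are illustrative; note that both the key and the value passed to set() must be strings):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Assumes Spark is on the classpath. spark.cores.max caps the total
// number of cores this application may claim on a standalone cluster.
val conf = new SparkConf()
  .setAppName("WordCount")                 // illustrative app name
  .setMaster("spark://master-host:7077")   // illustrative master URL
  .set("spark.cores.max", "32")
val sc = new SparkContext(conf)
```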
Thanks
Best Regards