Thanks Akhil, it was a simple fix, just as you said .. I missed it .. ☺

From: Akhil Das [mailto:ak...@sigmoidanalytics.com]
Sent: Wednesday, February 25, 2015 12:48 PM
To: Somnath Pandeya
Cc: user@spark.apache.org
Subject: Re: used cores are less than total no. of cores

You can set the following in the conf while creating the SparkContext (if you 
are not using spark-submit):

.set("spark.cores.max", "32")



Thanks
Best Regards

On Wed, Feb 25, 2015 at 11:52 AM, Somnath Pandeya 
<somnath_pand...@infosys.com<mailto:somnath_pand...@infosys.com>> wrote:
Hi All,

I am running a simple word count example on Spark (standalone cluster). The UI 
shows 32 cores available on each worker, but while running the jobs only 5 
cores are being used.

What should I do to increase the number of cores used, or is it selected based 
on the job?

Thanks
Somnath


