What version of Spark are you running?

Try calling sc.defaultParallelism. I’ve found that it is typically set to
the number of worker cores in your cluster.
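
For example, from the spark-shell (a minimal sketch; it assumes an existing
SparkContext named sc, and note that spark.default.parallelism, if set, can
override the cores-based default, so treat the value as an approximation):

    // Total parallelism Spark will use by default; on a standalone cluster
    // this usually equals the total worker cores available to the application.
    val approxTotalCores = sc.defaultParallelism
    println(s"defaultParallelism = $approxTotalCores")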


On Fri, Aug 29, 2014 at 3:39 AM, Kevin Jung <itsjb.j...@samsung.com> wrote:

> Hi all
> The Spark web UI gives me the information about total cores and used cores.
> I want to get this information programmatically.
> How can I do this?
>
> Thanks
> Kevin
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/how-can-I-get-the-number-of-cores-tp13111.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
