You can get parameters such as spark.executor.memory, but you cannot get the
number of executors or cores per executor, because those are parameters of
the Spark deploy environment, not of the SparkContext.

import org.apache.spark.{SparkConf, SparkContext}

// Values set on the SparkConf are visible from the driver.
val conf = new SparkConf().set("spark.executor.memory", "2G")
val sc = new SparkContext(conf)

// Both of these return "2G", the value set above.
sc.getConf.get("spark.executor.memory")
conf.get("spark.executor.memory")
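
If the goal is just a reasonable repartition target, a workaround is to ask
the running SparkContext instead of the conf. This is only a sketch:
getExecutorMemoryStatus counts block managers (which, depending on the deploy
mode, may include the driver), and someRdd is a placeholder for whatever RDD
you want to repartition.

// Sketch only: infer cluster size from the running context rather than
// the conf. getExecutorMemoryStatus has one entry per block manager, so
// this count may include the driver on some deploy modes.
val approxExecutors = sc.getExecutorMemoryStatus.size

// defaultParallelism is often a sensible partition count; on YARN it
// generally reflects the total cores granted to the application.
val repartitioned = someRdd.repartition(sc.defaultParallelism)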

2014-11-21 15:35 GMT+08:00 Tobias Pfeiffer <t...@preferred.jp>:

> Hi,
>
> when running on YARN, is there a way for the Spark driver to know how many
> executors, cores per executor, etc. there are? I want to know this so I can
> repartition to a good number.
>
> Thanks
> Tobias
>
