Is there any reliable way to find out the number of executors
programmatically, regardless of how the job is run? Ideally a method that
works for Spark standalone, YARN, and Mesos, whether or not the code runs
from the shell.

Things that I tried that don't work (sketched in code below):
- sparkContext.getExecutorMemoryStatus.size - 1 // works from the shell,
but not when the job is submitted via spark-submit
- sparkContext.getConf.getInt("spark.executor.instances", 1) // doesn't work
unless the value is explicitly configured
- a call to http://master:8080/json (this used to work, but doesn't anymore?)
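For reference, here is roughly what the first two attempts look like (a
minimal sketch; the comment about executors not yet having registered is
my guess at why the first one fails under spark-submit):

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("executor-count"))

    // Attempt 1: count block-manager entries and subtract the driver.
    // From the shell the executors are already registered; right after
    // spark-submit the map may still contain only the driver.
    val fromMemoryStatus = sc.getExecutorMemoryStatus.size - 1

    // Attempt 2: read the configured value; falls back to the default (1)
    // unless spark.executor.instances / --num-executors was set explicitly.
    val fromConf = sc.getConf.getInt("spark.executor.instances", 1)

    println(s"memoryStatus: $fromMemoryStatus, conf: $fromConf")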

I guess I could parse the HTML output of the Spark UI (rough sketch below),
but that seems dumb. Is there really no better way?
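In case it helps, the UI-scraping fallback I have in mind would be something
along these lines, hitting the UI's JSON endpoint rather than its HTML (a
sketch only; it assumes the driver UI is reachable on port 4040 and that the
/api/v1 REST endpoints exist in the Spark version at hand, and the app id
below is made up):

    import scala.io.Source

    // Hypothetical application id and driver-UI address; both depend on the deployment.
    val appId = "app-20150101120000-0000"
    val url = s"http://localhost:4040/api/v1/applications/$appId/executors"

    // Each executor (plus the driver) shows up as one JSON object with an "id"
    // field; subtract one for the driver. Crude counting; a JSON parser would
    // be safer.
    val json = Source.fromURL(url).mkString
    val numExecutors = json.sliding("\"id\"".length).count(_ == "\"id\"") - 1
    println(s"executors via REST: $numExecutors")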

Thanks,
Virgil.
