Hi all,

While submitting a Spark job, I am specifying the options --executor-cores 1
and --driver-cores 1. However, when the job was submitted, it used all
available cores. I then tried to limit the cores inside my main function with
        sc.getConf().set("spark.cores.max", "1");
but it still used all available cores.

I am running Spark in standalone mode (spark://<hostname>:7077).
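
For reference, my submit command looks roughly like this (the jar path and class name below are placeholders, not my actual ones):

```shell
# Minimal reproduction sketch; <hostname>, class name, and jar are placeholders.
spark-submit \
  --master spark://<hostname>:7077 \
  --executor-cores 1 \
  --driver-cores 1 \
  --class com.example.MyApp \
  myapp.jar
```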

Any idea what I am missing?

Thanks in advance,

Shridhar
