-c CORES, --cores CORES Total CPU cores to allow Spark applications to use
on the machine (default: all available); only on worker
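
For the worker-side limit, that option is passed when the standalone worker itself is launched, or set via SPARK_WORKER_CORES in conf/spark-env.sh; a rough sketch, with a placeholder master URL:

# cap the cores this worker advertises to applications
./sbin/start-slave.sh spark://<master-hostname>:7077 --cores 4

# or, equivalently, in conf/spark-env.sh
export SPARK_WORKER_CORES=4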

> sc.getConf().set()

sc.getConf() returns a copy of the configuration, and properties changed after the SparkContext has been created are not applied; spark.cores.max needs to be set on the SparkConf before the context is constructed. I think you should use this pattern (shown in
https://spark.apache.org/docs/latest/spark-standalone.html):

val conf = new SparkConf()
             .setMaster(...)
             .setAppName(...)
             .set("spark.cores.max", "1")val sc = new SparkContext(conf)

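The cap can also be passed to spark-submit instead of being hard-coded; in standalone mode --total-executor-cores plays the same role as spark.cores.max (a rough sketch, with placeholder class and jar names):

# submit with the application-wide core cap on the command line
./bin/spark-submit \
  --master spark://<hostname>:7077 \
  --total-executor-cores 1 \
  --class com.example.MyApp \
  my-app.jar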

On Wed, Mar 30, 2016 at 5:46 AM, vetal king <greenve...@gmail.com> wrote:

> Hi all,
>
> While submitting a Spark job I am specifying the options --executor-cores 1
> and --driver-cores 1. However, when the job was submitted, it used all
> available cores. So I tried to limit the cores within my main function with
>         sc.getConf().set("spark.cores.max", "1"); however, it still used all
> available cores.
>
> I am using Spark in standalone mode (spark://<hostname>:7077)
>
> Any idea what I am missing?
> Thanks in Advance,
>
> Shridhar
>
>
