Re: Unable to set cores while submitting Spark job

2016-03-31 Thread Mich Talebzadeh
Hi Shridhar,

Can you check on the Spark GUI whether the number of cores shown per worker is the same as you set up? This shows under the "Cores" column.

HTH

Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

Re: Unable to set cores while submitting Spark job

2016-03-31 Thread vetal king
Ted, Mich,

Thanks for your replies. I ended up using sparkConf.set() and accepted the number of cores as a parameter. But I am still not sure why spark-submit's executor-cores or driver-cores options did not work; setting cores within the main method seems a bit cumbersome.

Thanks again,
Shridhar

On Wed, Mar
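A minimal sketch of that workaround, assuming the core count arrives as the first program argument (the object name, argument position and default value are hypothetical, not from the original post):

    import org.apache.spark.{SparkConf, SparkContext}

    object MyJob {
      def main(args: Array[String]): Unit = {
        // Assumption: the core limit is passed as the first program argument, e.g. "1"
        val cores = if (args.nonEmpty) args(0) else "1"

        // The limit has to be on the SparkConf before the SparkContext is created;
        // changing the configuration afterwards does not affect the running context
        val conf = new SparkConf()
          .setAppName("MyJob")
          .set("spark.cores.max", cores)

        val sc = new SparkContext(conf)
        // ... job logic ...
        sc.stop()
      }
    }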

Re: Unable to set cores while submitting Spark job

2016-03-30 Thread Mich Talebzadeh
Hi Ted,

Can you specify the cores as follows, for example 12 cores?:

    val conf = new SparkConf().
      setAppName("ImportStat").
      setMaster("local[12]").
      set("spark.driver.allowMultipleContexts", "true").
      set("spark.hadoop.validateOutputSpecs", "false")
    val sc = new SparkContext(conf)

Re: Unable to set cores while submitting Spark job

2016-03-30 Thread Ted Yu
    -c CORES, --cores CORES   Total CPU cores to allow Spark applications to use on the
                              machine (default: all available); only on worker

> sc.getConf().set()

I think you should use this pattern (shown in https://spark.apache.org/docs/latest/spark-standalone.html):

    val conf = new SparkConf()
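A fuller sketch of that pattern (master URL and app name are placeholders; the point is that spark.cores.max goes on the SparkConf before the SparkContext is created):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setMaster("spark://master:7077")   // placeholder standalone master URL
      .setAppName("CoreLimitedApp")       // placeholder app name
      .set("spark.cores.max", "1")        // total cores this application may use on the cluster
    val sc = new SparkContext(conf)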

Unable to set cores while submitting Spark job

2016-03-30 Thread vetal king
Hi all,

While submitting a Spark job, I am specifying the options --executor-cores 1 and --driver-cores 1. However, when the job was submitted, it used all available cores. So I tried to limit the cores within my main function with sc.getConf().set("spark.cores.max", "1"); however it still
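For reference, two things likely explain this: on a standalone cluster --executor-cores only caps cores per executor (the application as a whole still takes all available cores unless spark.cores.max is set), and sc.getConf() returns a copy of the configuration, so setting it after the SparkContext exists changes nothing. A minimal sketch of the failing call (app name is a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("MyJob"))

    // getConf returns a *copy*, so this has no effect on the running context;
    // the cap has to be set on the SparkConf before the context is created,
    // as in the replies above
    sc.getConf.set("spark.cores.max", "1")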