Re: Memory is not used according to setting

2015-11-04 Thread Shixiong Zhu
You should use `SparkConf.set` rather than `SparkConf.setExecutorEnv`. Driver configurations must be set before your application starts, so pass them with the `--conf` argument when running `spark-submit`.

Best Regards,
Shixiong Zhu
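A minimal Scala sketch of the fix described above, assuming a standalone application (the app name is hypothetical; the 6g values mirror the original post):

    import org.apache.spark.{SparkConf, SparkContext}

    // spark.executor.memory is a Spark property, so set it on the SparkConf
    // with set() before the SparkContext is created.
    val conf = new SparkConf()
      .setAppName("MyApp") // hypothetical name
      .set("spark.executor.memory", "6g")

    val sc = new SparkContext(conf)

Driver memory has to be fixed before the driver JVM starts, so setting it inside the application is too late; pass it on the command line instead (class and jar names are placeholders):

    spark-submit --conf spark.driver.memory=6g --class com.example.MyApp myApp.jar

`--driver-memory 6g` is an equivalent shorthand for the same setting.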

Memory is not used according to setting

2015-11-04 Thread William Li
Hi All - I have a four-worker-node cluster, each node with 8GB of memory. When I submit a job, the driver takes 1g of memory, and each worker node allocates only one executor, which also takes just 1g. The job is configured with:

    sparkConf
      .setExecutorEnv("spark.driver.memory", "6g")
      .setExecutorEn
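For reference, the full configuration presumably looked something like the sketch below; the second call is cut off in the archive, and `spark.executor.memory` with a 6g value is a guess from context:

    val sparkConf = new SparkConf()
      // setExecutorEnv only exports environment variables to executor
      // processes; it does not set Spark properties, so these values are
      // ignored and driver and executors fall back to the 1g defaults.
      .setExecutorEnv("spark.driver.memory", "6g")
      .setExecutorEnv("spark.executor.memory", "6g") // key and value guessed

That would explain the 1g allocations observed above; see the reply earlier in the thread for the fix.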