RE: Spark memory configuration

2015-07-24 Thread PHELIPOT, REMY
Yes, it is very helpful. Thank you!

Re: Spark memory configuration

2015-07-23 Thread Alexander Bezzubov
Hi, thank you for your interest in Zeppelin! You just have to set the Spark interpreter properties in the 'Interpreters' menu: spark.cores.max: 24 (CPU) and spark.executor.memory: 22g (memory). You can actually use any of the application properties listed at http://spark.apache.org/docs/latest/configuration.html#application-properties …
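The two properties mentioned above can also be written in spark-defaults.conf syntax; the values below simply echo the ones from this thread (24 cores, 22g per executor) and are illustrative, not recommendations:

```
# Properties set in Zeppelin's 'Interpreters' menu for the Spark interpreter,
# shown here in spark-defaults.conf form for reference.
spark.cores.max        24     # cap on the total cores the application may claim
spark.executor.memory  22g    # heap size requested for each executor
```

Any property from the linked application-properties page can be added to the interpreter settings the same way.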

Spark memory configuration

2015-07-23 Thread PHELIPOT, REMY
Hello! I am trying to launch some very greedy processes on a Spark 1.4 cluster using Zeppelin, and I don't understand how to configure Spark memory properly. I've tried setting the SPARK_MASTER_MEMORY, SPARK_WORKER_MEMORY and SPARK_EXECUTOR_MEMORY environment variables on the Spark cluster nodes, w…
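For context, environment variables like these are typically set in conf/spark-env.sh on each node; a minimal sketch (assuming a standalone deployment, which the message does not confirm) might look like:

```shell
# conf/spark-env.sh on each cluster node (standalone mode, assumed).
# Note: these size the worker daemons themselves; the memory an individual
# application actually gets is driven by application properties such as
# spark.executor.memory, which is why setting only these may have no effect
# on a Zeppelin notebook's Spark jobs.
export SPARK_WORKER_MEMORY=22g    # total memory a worker may hand out to executors
export SPARK_EXECUTOR_MEMORY=22g  # default per-executor memory if the app sets none
```

This is why the answer above points to the Spark interpreter properties in Zeppelin rather than node-level environment variables.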