Yes it is very helpful.
Thank you!
-----Original Message-----
From: Alexander Bezzubov [mailto:abezzu...@nflabs.com]
Sent: Friday, July 24, 2015 3:58 AM
To: users@zeppelin.incubator.apache.org
Subject: Re: Spark memory configuration
Hi,
thank you for your interest in Zeppelin!
You just have to set the 'Spark' interpreter properties in the 'Interpreters' menu:

CPU:
spark.cores.max: 24

Memory:
spark.executor.memory: 22g
You can actually use any of the application properties listed at
http://spark.apache.org/docs/latest/configuration.html#application-properties
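As a side note, the same two properties can also be set outside Zeppelin in Spark's own conf/spark-defaults.conf, which any spark-submit job on the cluster would then pick up. A minimal sketch (the values are just the ones from this thread, adjust to your cluster):

```
# conf/spark-defaults.conf -- example values from this thread
spark.cores.max        24
spark.executor.memory  22g
```

Setting them per-interpreter in Zeppelin, as described above, only affects the notebooks, while spark-defaults.conf applies cluster-wide defaults.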
Hello !
I am trying to launch some very greedy processes on a Spark 1.4 cluster using
Zeppelin, and I don't understand how to configure Spark memory properly. I've
tried to set the SPARK_MASTER_MEMORY, SPARK_WORKER_MEMORY and SPARK_EXECUTOR_MEMORY
environment variables on the Spark cluster nodes, w