I am using CDH5.1 and Spark 1.0.0.

I'm trying to configure the resources allocated to each application. How do I
do this? For example, I would like each app to use 2 cores and 8G of RAM. I
have tried using the pyspark command-line parameters --driver-memory and
--driver-cores, but I see no effect of those changes in the Spark Master web
UI when the app is started.
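
Concretely, this is roughly the command I have been running (the master URL
is a placeholder for my actual master host):

    # trying to cap the app at 2 cores / 8G via the driver flags,
    # but the Master web UI shows no change
    ./bin/pyspark --master spark://master-host:7077 \
        --driver-memory 8g \
        --driver-cores 2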

Is there any way to do this from within Cloudera Manager as well?

Thanks.




