To give a bit more detail on what I'm trying to achieve:

I have many tasks I want to run in parallel, so I want each task to take
only a small part of the cluster (i.e. only a limited share of the 20
cores in the cluster).

I have important tasks that I want to get 10 cores, and small tasks that
I want to run with only 1 or 2 cores.
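For example, something like the following is what I have in mind (a minimal
sketch, assuming spark.cores.max in the application's SparkConf is the right
knob for capping cores per application in standalone mode; the app name and
master URL below are placeholders):

  import org.apache.spark.{SparkConf, SparkContext}

  // Each kind of job would be submitted as its own application;
  // spark.cores.max caps the total number of cores the standalone
  // scheduler will give to that application.
  val conf = new SparkConf()
    .setAppName("important-job")            // hypothetical app name
    .setMaster("spark://master-host:7077")  // placeholder standalone master URL
    .set("spark.cores.max", "10")           // 10 of the 20 cores; "1" or "2" for small jobs
  val sc = new SparkContext(conf)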



