Hi there

What is an optimal cluster setup for Spark? Given X amount of total
resources, would you favour more worker nodes with fewer resources each, or
fewer worker nodes with more resources each? Is this application-dependent?
If so, what are the things to consider, and what are good practices?
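
To make the question concrete: on YARN, I could picture slicing a
hypothetical pool of 64 cores and 240 GB of memory either way. The numbers
and app names below are made up purely for illustration:

    import org.apache.spark.SparkConf

    // Option A: few large executors, one per big machine.
    val fewLarge = new SparkConf()
      .setAppName("few-large-executors")
      .set("spark.executor.instances", "4")   // 4 executors
      .set("spark.executor.cores", "16")      // 16 cores each
      .set("spark.executor.memory", "56g")    // headroom left for overhead

    // Option B: many small executors spread across many machines.
    val manySmall = new SparkConf()
      .setAppName("many-small-executors")
      .set("spark.executor.instances", "16")  // 16 executors
      .set("spark.executor.cores", "4")       // 4 cores each
      .set("spark.executor.memory", "14g")

Either way the cluster sees the same total resources; is one split generally
preferable, or does it come down to trade-offs like GC pressure on large
heaps versus the per-JVM and shuffle overhead of many small executors?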

Cheers




