Hi Chris,

Thank you for posting the question.
Tuning Spark configurations is a tricky task since there are a lot of factors
to consider.
The configurations that you listed cover most of them.

To understand your situation, here are a few questions that can guide your
tuning decisions:
1) What kind of Spark applications are you intending to run?
2) Which cluster manager have you decided to go with?
3) How frequently are these applications going to run? (for the sake of
scheduling)
4) Is this used by multiple users?
5) What else do you have in the cluster that will interact with Spark? (for
the sake of resolving dependencies)
Personally, I would suggest answering these questions before jumping into
tuning.
A cluster manager like YARN helps you reason about the settings for cores and
memory, since the applications have to be scheduled against the resources it
manages.
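
As a rough illustration (the values below are hypothetical and depend entirely
on your node sizes, workload, and other tenants of the cluster), a
spark-defaults.conf for a YARN deployment might start from something like:

  # spark-defaults.conf -- illustrative starting point for YARN; tune to your nodes
  spark.master                     yarn
  # Per-executor resources; keep these well under the YARN container limits
  spark.executor.memory            4g
  spark.executor.cores             2
  # Memory for the driver process
  spark.driver.memory              2g
  # Let YARN grow and shrink the number of executors with the workload
  spark.dynamicAllocation.enabled  true
  spark.shuffle.service.enabled    true

With dynamic allocation, YARN's scheduler decides how many executors of that
size to grant at any time, which is why knowing the workload, the run
frequency, and the other users of the cluster matters before fixing these
numbers.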

Hope that helps you get started in the right direction.

-----
Neelesh S. Salian
Cloudera