Sorry, yes, you are right, the documentation does indeed explain that setting
spark.* options is the way to pass Spark configuration options to workers.
Additionally, we've used the same mechanism to pass application-specific
configuration options to workers; the "hack" part is naming our
application-specific options "spark.myapp.*", which relies on the Spark
library just copying around spark.* options without checking to see whether
the option names are valid Spark options.
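
To make the trick concrete, here is a minimal pure-Python model of the behavior described above (not actual Spark code; the `spark.myapp.*` keys are hypothetical names): the driver ships every "spark."-prefixed option to workers verbatim, without validating that the key names a real Spark option, so application-specific keys ride along for free.

```python
def options_shipped_to_workers(driver_conf):
    """Model of the propagation rule: only 'spark.'-prefixed keys reach workers."""
    return {k: v for k, v in driver_conf.items() if k.startswith("spark.")}

driver_conf = {
    "spark.executor.memory": "4g",   # a real Spark option
    "spark.myapp.mode": "fast",      # application-specific; name is a "hack"
    "myapp.debug": "true",           # no "spark." prefix, so NOT propagated
}

worker_conf = options_shipped_to_workers(driver_conf)
assert worker_conf["spark.myapp.mode"] == "fast"  # hack works
assert "myapp.debug" not in worker_conf           # unprefixed key is dropped
```

The fragility, of course, is that this relies on Spark continuing to copy unrecognized `spark.*` keys around rather than rejecting them.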

Regards,

Theodore



-----
-- 
Theodore Wong <t...@tmwong.org>
www.tmwong.org

--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/How-to-pass-config-variables-to-workers-tp5780p5916.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.