I've been using SparkConf on my project for quite some time now to store
configuration information for its various components. This has worked very
well thus far in situations where I have control over the creation of the
SparkContext & the SparkConf.

I have run into a bit of a problem trying to integrate this same approach
with the Spark shell, however. I have a bunch of properties in a
properties file that are shared across several different types of
applications (web containers, etc.), but SparkConf ignores these
properties because they aren't prefixed with spark.*
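
For context, when I do control the SparkConf myself I can work around this
by copying the entries in explicitly and prefixing the keys so they survive
the spark.* filtering. Roughly something like the sketch below (the file
path and helper name are just illustrative, not part of any real API):

  import java.io.FileInputStream
  import java.util.Properties
  import scala.collection.JavaConverters._
  import org.apache.spark.SparkConf

  object ConfLoader {
    // Load an arbitrary properties file and copy its entries into a SparkConf.
    // Keys without a "spark." prefix get one added, since only spark.*-prefixed
    // keys are picked up when the conf is built from system properties.
    def load(path: String): SparkConf = {
      val props = new Properties()
      val in = new FileInputStream(path)
      try props.load(in) finally in.close()

      val conf = new SparkConf()
      props.stringPropertyNames().asScala.foreach { key =>
        val sparkKey = if (key.startsWith("spark.")) key else s"spark.$key"
        conf.set(sparkKey, props.getProperty(key))
      }
      conf
    }
  }

That works, but it means renaming keys that other (non-Spark) applications
read from the same file, which is exactly what I'm trying to avoid.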

Is this really necessary? It doesn't actually stop people from adding their
own properties, and it limits the usefulness of having one central
configuration object.