Thanks, I will check out the SparkSubmit class.

On Fri, Mar 30, 2018 at 2:46 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> Why: it's part historical, part "how else would you do it".
>
> SparkConf needs to read the properties passed on the command line, but
> SparkConf is something that user code instantiates, so we can't easily
> make it read data from arbitrary locations. You could use thread
> locals and other tricks, but user code can always break those.
>
> Where: this is done by the SparkSubmit class (in the Scala code, look
> for "sys.props").
>
>
> On Fri, Mar 30, 2018 at 11:41 AM, Koert Kuipers <ko...@tresata.com> wrote:
> > Does anyone know why all Spark settings end up as system properties,
> > and where this is done?
> >
> > For example, when I pass "--conf spark.foo=bar" to spark-submit,
> > System.getProperty("spark.foo") is equal to "bar".
> >
> > I grepped the Spark codebase for System.setProperty and
> > System.setProperties, and I see them used in some places, but never
> > for all Spark settings.
> >
> > We are running into some weird side effects because of this, since we
> > use Typesafe Config, which applies system properties as overrides, so
> > we see the settings pop up there again unexpectedly.
>
>
>
> --
> Marcelo
>
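
The mechanism described above can be illustrated with a minimal sketch. This
is not the actual SparkSubmit code; the map literal stands in for whatever
"--conf" parsing produces. The idea is that the launcher copies each setting
into JVM system properties via sys.props, which is why user code (and
SparkConf, which by default reads every system property starting with
"spark.") sees them later:

    // Minimal sketch, not the real SparkSubmit: the launcher side copies
    // parsed "--conf" entries into JVM system properties, and code running
    // later in the same JVM can see them via System.getProperty.
    object SubmitPropsSketch {
      def main(args: Array[String]): Unit = {
        // stand-in for what "--conf spark.foo=bar" parsing would produce
        val parsedConf = Map("spark.foo" -> "bar")
        parsedConf.foreach { case (k, v) => sys.props(k) = v }

        // user code elsewhere in the same JVM now sees the setting
        println(System.getProperty("spark.foo")) // prints "bar"
      }
    }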

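The Typesafe Config side effect mentioned in the original question follows
from ConfigFactory.load(), which layers JVM system properties on top of
application.conf, so any "spark.*" property set this way shows up in the
loaded config as an override. A small sketch, assuming the typesafe-config
library is on the classpath:

    import com.typesafe.config.ConfigFactory

    object ConfigOverrideSketch {
      def main(args: Array[String]): Unit = {
        // simulate what spark-submit effectively does before user code runs
        System.setProperty("spark.foo", "bar")
        // make load() re-read system properties changed after startup
        ConfigFactory.invalidateCaches()
        val config = ConfigFactory.load()
        // "spark.foo" is present even though it appears in no .conf file
        println(config.getString("spark.foo")) // prints "bar"
      }
    }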