It's not a hack; this is documented here:
http://spark.apache.org/docs/0.9.1/configuration.html, and it is in fact the
proper way of setting per-application Spark configurations.

Additionally, you can specify default Spark configurations so you don't
need to set them manually for every application. If you are running Spark 0.9
or earlier, you can set them through the environment variable
SPARK_JAVA_OPTS in conf/spark-env.sh.
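
For example, a line like the following in conf/spark-env.sh should do it (a
sketch only; it reuses the spark.myapp.myproperty key from the message below
and relies on SPARK_JAVA_OPTS accepting -D system properties):

export SPARK_JAVA_OPTS="-Dspark.myapp.myproperty=propertyValue"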

As of Spark 1.0, however, this mechanism is deprecated. The new way of
setting default Spark configurations is through conf/spark-defaults.conf, in
the following format:

spark.config.one value
spark.config.two value2

More details are documented here:
http://people.apache.org/~pwendell/spark-1.0.0-rc7-docs/configuration.html.
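
For completeness, reading such a "spark."-prefixed property back inside a
task could look roughly like the sketch below. It assumes the executor-side
SparkEnv.get().conf() accessor (a developer-level API) and reuses the
spark.myapp.myproperty key from the message below, so treat it as an
illustration rather than the only way to do this:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.SparkEnv;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

public class ConfigPropagationSketch {
    public static void main(String[] args) {
        // Driver side: keys starting with "spark." are shipped to the executors.
        SparkConf conf = new SparkConf()
            .setAppName("ConfigPropagationSketch")
            .setMaster("local[2]");  // master set here only for a quick local test
        conf.set("spark.myapp.myproperty", "propertyValue");
        JavaSparkContext context = new JavaSparkContext(conf);

        JavaRDD<String> tagged = context.parallelize(Arrays.asList(1, 2, 3)).map(
            new Function<Integer, String>() {
                public String call(Integer n) {
                    // Executor side: read the propagated value back from the
                    // executor's own SparkConf via SparkEnv (developer API).
                    String value = SparkEnv.get().conf().get("spark.myapp.myproperty");
                    return n + ":" + value;
                }
            });

        System.out.println(tagged.collect());
        context.stop();
    }
}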


2014-05-16 15:16 GMT-07:00 Theodore Wong <t...@tmwong.org>:

> I found that the easiest way was to pass variables in the Spark
> configuration
> object. The only catch is that all of your property keys must begin with
> "spark." in order for Spark to propagate the values. So, for example, in
> the
> driver:
>
> SparkConf conf = new SparkConf();
> conf.set("spark.myapp.myproperty", "propertyValue");
>
> JavaSparkContext context = new JavaSparkContext(conf);
>
> I realize that this is most likely a hack, but it works and is easy (at
> least for me) to follow from a programming standpoint compared to setting
> environment variables outside of the program.
>
> Regards,
>
> Theodore Wong
>
>
>
> -----
> --
> Theodore Wong <t...@tmwong.org>
> www.tmwong.org
>
>
