Hello Sandy,
Thank you for your explanation. Then I would at least expect that to be
consistent across local, yarn-client, and yarn-cluster modes (and not lead
to a situation where it somehow works in two of them but not the third).
Kind regards,
Emre Sevinç
http://www.bigindustries.be/
Ah, yes, I believe this is because only properties prefixed with "spark."
get passed on. The purpose of the --conf option is to allow passing
Spark properties to the SparkConf, not to add general key-value pairs to
the JVM system properties.
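To make that concrete, here is a minimal sketch (the key names
spark.myapp.greeting and myapp.greeting are made up for illustration):

    // Submitted with, e.g.:
    //   spark-submit --class MyApp \
    //     --conf spark.myapp.greeting=hello \
    //     --conf myapp.greeting=hello \
    //     myapp.jar
    import org.apache.spark.SparkConf

    object MyApp {
      def main(args: Array[String]): Unit = {
        // Loads whatever Spark properties spark-submit delivered
        val conf = new SparkConf()

        // Prefixed with "spark.", so it is treated as a Spark property
        // and reaches the SparkConf:
        println(conf.getOption("spark.myapp.greeting")) // Some(hello)

        // No "spark." prefix, so SparkConf never picks it up:
        println(conf.getOption("myapp.greeting"))       // None
      }
    }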
-Sandy
On Tue, Mar 24, 2015 at 4:25 AM, Emre Sevinc wrote:
Hello,
According to Spark Documentation at
https://spark.apache.org/docs/1.2.1/submitting-applications.html :
--conf: Arbitrary Spark configuration property in key=value format. For
values that contain spaces wrap “key=value” in quotes (as shown).
And indeed, when I use that parameter, in my
Hi Emre,
The --conf property is meant to work with yarn-cluster mode.
System.getProperty(key) isn't guaranteed to return it, but new
SparkConf().get(key) should. Does it not?
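For instance (a sketch; spark.myapp.key is a hypothetical property name):

    import org.apache.spark.SparkConf

    object ConfCheck {
      def main(args: Array[String]): Unit = {
        // Suppose the job was submitted with:
        //   spark-submit --conf spark.myapp.key=value ... app.jar

        // Portable: SparkConf sees Spark properties in every deploy mode,
        // including yarn-cluster, where the config is shipped to the driver
        // container rather than arriving as JVM system properties.
        val fromConf = new SparkConf().get("spark.myapp.key")    // "value"

        // Not guaranteed: works only if the property was also set as a JVM
        // system property on the driver, which yarn-cluster mode may not do.
        val fromSysProp = System.getProperty("spark.myapp.key")  // may be null

        println(s"conf=$fromConf, sysProp=$fromSysProp")
      }
    }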
-Sandy
On Mon, Mar 23, 2015 at 8:39 AM, Emre Sevinc emre.sev...@gmail.com wrote:
Hello,
According to Spark Documentation at
https://spark.apache.org/docs/1.2.1/submitting-applications.html :