Hi Emre,

The --conf property is meant to work in yarn-cluster mode as well.
System.getProperty("key") isn't guaranteed to see those values on the driver
in that mode, but new SparkConf().get("key") should.  Does it not?
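For example, something like the following (just a sketch; "spark.myapp.key"
is a made-up property name, and note that spark-submit generally only
forwards --conf keys that start with the "spark." prefix -- others may be
ignored with a warning):

```java
import org.apache.spark.SparkConf;

public class ConfExample {
  public static void main(String[] args) {
    // Submitted with something like:
    //   spark-submit --master yarn-cluster \
    //     --conf "spark.myapp.key=value" ...

    // SparkConf is populated from the submitted configuration, so this
    // should work in both yarn-client and yarn-cluster mode, whereas
    // System.getProperty() only reflects the local driver JVM's properties.
    SparkConf conf = new SparkConf();
    String value = conf.get("spark.myapp.key");
    System.out.println(value);
  }
}
```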

-Sandy

On Mon, Mar 23, 2015 at 8:39 AM, Emre Sevinc <emre.sev...@gmail.com> wrote:

> Hello,
>
> According to Spark Documentation at
> https://spark.apache.org/docs/1.2.1/submitting-applications.html :
>
>   --conf: Arbitrary Spark configuration property in key=value format. For
> values that contain spaces wrap “key=value” in quotes (as shown).
>
> And indeed, when I use that parameter, in my Spark program I can retrieve
> the value of the key by using:
>
>     System.getProperty("key");
>
> This works when I test my program locally and in yarn-client mode: I can
> log the value of the key and see that it matches what I wrote on the
> command line. But it returns *null* when I submit the very same program in
> *yarn-cluster* mode.
>
> Why can't I retrieve the value of key given as --conf "key=value" when I
> submit my Spark application in *yarn-cluster* mode?
>
> Any ideas and/or workarounds?
>
>
> --
> Emre Sevinç
> http://www.bigindustries.be/
>
>
