Hello Sandy,
Thank you for your explanation. Then I would at least expect that to be
consistent across local, yarn-client, and yarn-cluster modes (and not lead
to a case where it somehow works in two of them but not the third).
Kind regards,
Emre Sevinç
http://www.bigindustries.be/
Ah, yes, I believe this is because only properties prefixed with "spark"
get passed on. The purpose of the "--conf" option is to allow passing
Spark properties to the SparkConf, not to add general key-value pairs to
the JVM system properties.
-Sandy
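[Editorial illustration] Sandy's explanation suggests a workaround: prefix custom keys with "spark." so they survive the forwarding into the SparkConf. The sketch below is plain Java with no Spark dependency; the filtering rule it models (keep only "spark."-prefixed keys) is inferred from this thread, not copied from Spark's actual implementation, and the class and method names are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

public class ConfFilterSketch {
    // Hypothetical stand-in for how spark-submit forwards --conf entries:
    // per this thread, only keys with the "spark." prefix reach the SparkConf.
    static Map<String, String> forwardToSparkConf(Map<String, String> confArgs) {
        Map<String, String> sparkConf = new HashMap<>();
        for (Map.Entry<String, String> e : confArgs.entrySet()) {
            if (e.getKey().startsWith("spark.")) {
                sparkConf.put(e.getKey(), e.getValue());
            }
        }
        return sparkConf;
    }

    public static void main(String[] args) {
        Map<String, String> confArgs = new HashMap<>();
        confArgs.put("key", "someValue");             // dropped: no "spark." prefix
        confArgs.put("spark.myapp.key", "someValue"); // kept

        Map<String, String> conf = forwardToSparkConf(confArgs);
        System.out.println(conf.containsKey("key"));     // false
        System.out.println(conf.get("spark.myapp.key")); // someValue
    }
}
```

In other words, passing `--conf "spark.myapp.key=someValue"` instead of `--conf "key=someValue"` should make the value retrievable via the SparkConf.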
On Tue, Mar 24, 2015 at 4:25 AM, Emre Sevinc wrote:
Hello Sandy,
Your suggestion does not work when I try it locally:
When I pass
--conf "key=someValue"
and then try to retrieve it like:
SparkConf sparkConf = new SparkConf();
logger.info("* * * key ~~~> {}", sparkConf.get("key"));
I get
Exception in thread "main" java.util.NoSuchElementException
Hi Emre,
The --conf property is meant to work with yarn-cluster mode.
System.getProperty("key") isn't guaranteed to work, but new
SparkConf().get("key") should. Does it not?
-Sandy
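[Editorial illustration] The NoSuchElementException Emre reports above is consistent with a lookup that throws on a missing key. Below is a plain-Java sketch of that behavior; it is a hypothetical stand-in, not Spark's actual SparkConf source, though Spark's real SparkConf does offer a two-argument get(key, defaultValue) overload as a defensive alternative.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.NoSuchElementException;

// Sketch of a SparkConf-like lookup: get(key) throws NoSuchElementException
// for an absent key (matching the stack trace above), while
// get(key, defaultValue) falls back instead of throwing.
public class SparkConfLookupSketch {
    private final Map<String, String> settings = new HashMap<>();

    public void set(String key, String value) {
        settings.put(key, value);
    }

    public String get(String key) {
        String v = settings.get(key);
        if (v == null) {
            throw new NoSuchElementException(key);
        }
        return v;
    }

    public String get(String key, String defaultValue) {
        return settings.getOrDefault(key, defaultValue);
    }

    public static void main(String[] args) {
        SparkConfLookupSketch conf = new SparkConfLookupSketch();
        conf.set("spark.myapp.key", "someValue");
        System.out.println(conf.get("spark.myapp.key"));
        System.out.println(conf.get("key", "fallback")); // no exception thrown
    }
}
```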
On Mon, Mar 23, 2015 at 8:39 AM, Emre Sevinc wrote:
> Hello,
>
> According to Spark Documentation at
> https://spark.apa