In fact, sparkConf.set("spark.whateverPropertyYouWant", "Value") gets
shipped to the executors.
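
For example (a minimal sketch in Scala; "spark.myapp.propX" is an
illustrative key name), any property whose name starts with "spark." set
on the driver's SparkConf can be read back inside a task:

  import org.apache.spark.{SparkConf, SparkContext, SparkEnv}

  val conf = new SparkConf()
    .setAppName("conf-propagation-sketch")
    .set("spark.myapp.propX", "B")  // note the "spark." prefix

  val sc = new SparkContext(conf)

  // Each task reads the property from the executor-side SparkConf.
  val seen = sc.parallelize(1 to 4, 4)
    .map(_ => SparkEnv.get.conf.get("spark.myapp.propX"))
    .collect()

  seen.foreach(println)  // prints "B" once per partition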

Thanks
Best Regards

On Fri, May 1, 2015 at 2:55 PM, Michael Ryabtsev <mich...@totango.com>
wrote:

> Hi,
>
> We've had a similar problem, but with a log4j properties file.
> The only working approach we've found was to externally deploy the
> properties file to the Spark conf folder on each worker machine and to
> configure the executor JVM options with:
>
> sparkConf.set("spark.executor.extraJavaOptions",
> "-Dlog4j.configuration=log4j_integrationhub_sparkexecutor.properties");
>
> In our case this is done from the Java code, but I think it can also be
> passed as a parameter to spark-submit.
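>
> For instance, something along these lines should be equivalent on the
> command line (a sketch; if the file is not already on each worker, the
> --files option of spark-submit can ship it to the executors' working
> directories):
>
>   spark-submit \
>     --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j_integrationhub_sparkexecutor.properties" \
>     <the usual jar and arguments>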
>
> Regards,
>
> Michael.
>
> On Fri, May 1, 2015 at 2:26 AM, Tian Zhang <tzhang...@yahoo.com> wrote:
>
>> Hi,
>>
>> We have a scenario as below and would like your suggestion.
>> We have an app.conf file with propX=A as the default, built into the fat
>> jar that is provided to spark-submit.
>> We have an env.conf file with propX=B that we would like spark-submit to
>> take as input, overriding the default on both the driver and the
>> executors.
>> Note that on the executors we are using a package that reads its
>> configuration properties with Typesafe Config.
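>>
>> For context, that package loads its settings roughly like this (a
>> simplified sketch, assuming it calls ConfigFactory.load(); the key name
>> matches our example):
>>
>>   import com.typesafe.config.ConfigFactory
>>
>>   // load() reads the default classpath configuration, unless the
>>   // config.file/config.resource system properties point elsewhere.
>>   val config = ConfigFactory.load()
>>   val propX = config.getString("propX")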
>>
>> How do we do that?
>>
>> Thanks.
>>
>> Tian
>>
>
