System properties and environment variables are two different things. One
can use spark.executor.extraJavaOptions to pass system properties and
spark-env.sh to pass environment variables.
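
For example (an untested sketch; my.prop and MY_VAR are placeholder names):

    # system property -- visible via System.getProperty in the executor JVM:
    spark-submit --conf "spark.executor.extraJavaOptions=-Dmy.prop=foo" ...

    # environment variable -- exported in conf/spark-env.sh on each worker:
    export MY_VAR=bar

and the two are read differently in task code:

    rdd.map { _ => (System.getProperty("my.prop"), sys.env.get("MY_VAR")) }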

-raghav

On Mon, Aug 24, 2015 at 1:00 PM, Hemant Bhanawat <hemant9...@gmail.com>
wrote:

> That's surprising. Passing the environment variables using
> spark.executor.extraJavaOptions=-Dmyenvvar=xxx to the executor and then
> fetching them using System.getProperty("myenvvar") has worked for me.
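>
> Roughly like this (a sketch; the property has to be read inside a task so
> that it runs in the executor JVM, not on the driver):
>
>     spark-submit --conf "spark.executor.extraJavaOptions=-Dmyenvvar=xxx" ...
>
>     sc.parallelize(1 to 3)
>       .map(_ => System.getProperty("myenvvar"))
>       .collect()   // Array("xxx", "xxx", "xxx")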
>
> What is the error that you guys got?
>
> On Mon, Aug 24, 2015 at 12:10 AM, Sathish Kumaran Vairavelu <
> vsathishkuma...@gmail.com> wrote:
>
>> spark-env.sh works for me in Spark 1.4 but not
>> spark.executor.extraJavaOptions.
>>
>> On Sun, Aug 23, 2015 at 11:27 AM Raghavendra Pandey <
>> raghavendra.pan...@gmail.com> wrote:
>>
>>> I think the only way to pass environment variables to the worker nodes is
>>> to set them in the spark-env.sh file on each worker node.
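>>>
>>> Something like this (MY_VAR is a placeholder; the worker has to be
>>> restarted to pick the change up, since spark-env.sh is sourced when the
>>> daemon starts):
>>>
>>>     # in $SPARK_HOME/conf/spark-env.sh on every worker node:
>>>     export MY_VAR=some-value
>>>
>>> Executors launched by that worker inherit its environment, so
>>> sys.env.get("MY_VAR") should work inside task code.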
>>>
>>> On Sun, Aug 23, 2015 at 8:16 PM, Hemant Bhanawat <hemant9...@gmail.com>
>>> wrote:
>>>
>>>> Check for spark.driver.extraJavaOptions and
>>>> spark.executor.extraJavaOptions in the following article. I think you can
>>>> use -D to pass system properties:
>>>>
>>>> spark.apache.org/docs/latest/configuration.html#runtime-environment
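>>>>
>>>> For the driver side it would look something like this (a sketch;
>>>> my.flag is a placeholder):
>>>>
>>>>     spark-submit --conf "spark.driver.extraJavaOptions=-Dmy.flag=on" ...
>>>>
>>>>     // in driver code, no task needed:
>>>>     val flag = System.getProperty("my.flag")
>>>>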
>>>> Hi,
>>>>
>>>> I am starting a spark streaming job in standalone mode with
>>>> spark-submit.
>>>>
>>>> Is there a way to make the UNIX environment variables with which
>>>> spark-submit is started available to the processes started on the worker
>>>> nodes?
>>>>
>>>> Jan
>>>>
>>>>
>>>
>
