Finally, I found the solution:
on the Spark context (via its SparkConf) you can set spark.executorEnv.[EnvironmentVariableName],
and these variables will be available in the environment of the executors.
This is in fact documented, but somehow I missed it.
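For reference, a minimal sketch of that approach (the app name and MY_VAR are
placeholders I made up, not part of the docs):

    import org.apache.spark.{SparkConf, SparkContext}

    // Any property named spark.executorEnv.[EnvironmentVariableName] becomes
    // an OS environment variable in each executor process.
    val conf = new SparkConf()
      .setAppName("env-demo") // placeholder
      .set("spark.executorEnv.MY_VAR", "some-value")
    val sc = new SparkContext(conf)

    // On the executors the variable shows up in the process environment:
    sc.parallelize(1 to 2).map(_ => sys.env.get("MY_VAR")).collect()
    // e.g. Array(Some(some-value), Some(some-value))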
Ok, I went in the direction of system vars from the beginning, probably because
the question was about passing variables to a particular job.
Anyway, the decision to use either system vars or environment vars would
depend solely on whether you want to make them available to all the Spark
processes on a node or only to a particular job.
System properties and environment variables are two different things. One
can use spark.executor.extraJavaOptions to pass system properties and
spark-env.sh to pass environment variables.
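To make the difference concrete, a rough sketch (my.prop and MY_VAR are made-up
names, not anything Spark defines):

    // System property: per-JVM, set with -D, e.g. via
    // spark.executor.extraJavaOptions=-Dmy.prop=foo
    val prop = Option(System.getProperty("my.prop")) // None if not set
    // Environment variable: per-process environment, set e.g. in spark-env.sh
    val env = sys.env.get("MY_VAR")                  // None if not set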
-raghav
On Mon, Aug 24, 2015 at 1:00 PM, Hemant Bhanawat hemant9...@gmail.com
wrote:
That's surprising. Passing the environment variables using
spark.executor.extraJavaOptions=-Dmyenvvar=xxx to the executor and then
fetching them using System.getProperty("myenvvar") has worked for me.
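For example, something along these lines (sc is an existing SparkContext; the
job is assumed to be submitted with
--conf "spark.executor.extraJavaOptions=-Dmyenvvar=xxx"):

    val values = sc.parallelize(Seq(1, 2))
      .map(_ => System.getProperty("myenvvar")) // read on the executor JVM
      .collect()
    // expected: Array(xxx, xxx)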
What is the error that you guys got?
On Mon, Aug 24, 2015 at 12:10 AM, Sathish Kumaran Vairavelu wrote:
spark-env.sh works for me in Spark 1.4 but not
spark.executor.extraJavaOptions.
On Sun, Aug 23, 2015 at 11:27 AM Raghavendra Pandey
raghavendra.pan...@gmail.com wrote:
I think the only way to pass environment variables to a worker node is to
write them in the spark-env.sh file on each worker node.
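A sketch of what that looks like (MY_VAR is just a placeholder; the path is the
standard conf location for standalone mode):

    # conf/spark-env.sh on each worker node
    export MY_VAR=some-value

Executors launched by that worker inherit the variable, so a task can read it
with sys.env.get("MY_VAR"). The worker typically has to be restarted for the
change to take effect.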
On Sun, Aug 23, 2015 at 8:16 PM, Hemant Bhanawat hemant9...@gmail.com
wrote:
Check for spark.driver.extraJavaOptions and spark.executor.extraJavaOptions
in the following article. I think you can use -D to pass system vars:
spark.apache.org/docs/latest/configuration.html#runtime-environment
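For instance, both can be set at submit time (the class and jar names below are
placeholders):

    spark-submit \
      --conf "spark.driver.extraJavaOptions=-Dmy.prop=driver-value" \
      --conf "spark.executor.extraJavaOptions=-Dmy.prop=executor-value" \
      --class com.example.Main myapp.jar

Inside the driver or an executor, System.getProperty("my.prop") should then
return the corresponding value.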
Hi,
I am starting a Spark Streaming job in standalone mode with spark-submit.
Is there a way to pass environment variables to the executors?