Hi,

We have some code which worked with Spark 1.5.0, which allowed us to pass
system properties to the executors:

SparkConf sparkConf = new SparkConf().setAppName(appName);
...
// for each property we want to forward:
for (Map.Entry<String, String> prop : props.entrySet()) {
    sparkConf.setExecutorEnv(prop.getKey(), prop.getValue());
}

Our current Spark Streaming driver is compiled against
spark-streaming_2.11:2.0.0.

We no longer see the properties on the executor side. We tried the
following approaches, and none of them worked:

sparkConf.set("spark.MyPropName", "MyPropValue");
sparkConf.setExecutorEnv("MyProp1", "MyPropVal1");
sparkConf.setExecutorEnv("SPARK_JAVA_OPTS", "-Daaa=bbb");

On the executor side, I retrieve the value with System.getenv(), falling
back to System.getProperty() in case the value ends up in the system
properties rather than the environment.
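
For reference, the executor-side lookup is roughly this (a minimal sketch;
"MyProp1" is just a placeholder name):

String value = System.getenv("MyProp1");
if (value == null) {
    // Fall back in case the value arrived as a JVM system property instead.
    value = System.getProperty("MyProp1");
}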

The only way I was able to get this to work was by defining a custom Spark
conf file, with every property name prefixed with "spark.", and passing
that file to spark-submit via --properties-file <location>.
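
For example (file name, property names, and paths below are placeholders):

myapp.conf:

spark.MyProp1    MyPropVal1
spark.MyProp2    MyPropVal2

spark-submit --properties-file /path/to/myapp.conf --class com.example.MyDriver myapp.jar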

Is anyone aware of a change that may have caused setExecutorEnv to work
differently in 2.0.0?

Is there any other way to pass environment variables or system properties
to the executors, preferably programmatically rather than through
configuration files?
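
To be concrete, something along these lines is what I'm after (a sketch,
assuming the documented spark.executorEnv.* and
spark.executor.extraJavaOptions keys are the intended route; the property
names are placeholders):

// Environment variable on the executors (I believe this is what
// setExecutorEnv writes under the hood):
sparkConf.set("spark.executorEnv.MyProp1", "MyPropVal1");
// JVM system property on the executors (extraJavaOptions is the
// documented replacement for SPARK_JAVA_OPTS):
sparkConf.set("spark.executor.extraJavaOptions", "-Daaa=bbb");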

Thanks,
- Dmitry
