Hi,

I have multiple Spark deployments running on Mesos.
I use spark.executor.uri to fetch the Spark distribution onto the executor nodes.
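For reference, this is roughly how it is set in spark-defaults.conf (the ZooKeeper hosts and the archive path below are just placeholders, not my actual values):

    spark.master        mesos://zk://host1:2181,host2:2181/mesos
    spark.executor.uri  hdfs://namenode:8020/dist/spark-1.6.0-bin-hadoop2.6.tgz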

Every time I upgrade Spark, I download the default distribution and just
add a custom spark-env.sh to the spark/conf folder before re-uploading it.

Furthermore, any change I want to make in spark-env.sh forces me to
re-package the distribution.

I'm trying to find a way to point the executors at a Spark conf dir that
already exists on the worker nodes by using spark.executor.extraJavaOptions
(-DSPARK_CONF_DIR=/path/on/worker/node), but it doesn't seem to work.
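Concretely, this is the kind of setting I tried in spark-defaults.conf (the path is only an example of a directory present on every worker):

    spark.executor.extraJavaOptions  -DSPARK_CONF_DIR=/path/on/worker/node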

Any idea how I can achieve this?

Thanks
