Hi all,
In standalone mode, spark.local.dir is ignored after the Spark Worker has launched.
Here is a scenario, assuming 1 master node and 1 worker node:
1. Run $SPARK_HOME/sbin/start-all.sh to launch the Spark Master and Worker.
2. On the worker node, modify the configuration ($SPARK_HOME/conf/spark-defaults.conf),
setting spark.local.dir to another directory (/tmp_new).
3. Run spark-shell from the master node.
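For reference, the steps above can be sketched as shell commands (paths and the master URL are illustrative, taken from the scenario; SPARK_HOME is assumed to be set on both nodes):

```shell
# 1. On the master node: start the standalone master and all workers
$SPARK_HOME/sbin/start-all.sh

# 2. On the worker node: point spark.local.dir at the new scratch directory
echo "spark.local.dir  /tmp_new" >> $SPARK_HOME/conf/spark-defaults.conf

# 3. Back on the master node: launch an application against the running cluster
#    (<master-host> is a placeholder for the actual master hostname)
$SPARK_HOME/bin/spark-shell --master spark://<master-host>:7077
```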

The executor in this scenario creates its scratch directory, e.g.
"spark-bb0876f2-7fa9-4f15-b790-24252183a4f1", under /tmp rather than /tmp_new.
This happens because the worker builds an immutable SparkConf instance when it
first launches and keeps referring to it when creating new executors, even for
an application that wants to change its scratch dir.
Can I change an application's spark.local.dir without restarting the Spark workers?

Thanks,
Jung
