Can anyone help? How can I configure a different spark.local.dir for each executor?
On 23 Mar, 2014, at 12:11 am, Tsai Li Ming <mailingl...@ltsai.com> wrote:

> Hi,
>
> Each of my worker nodes has its own unique spark.local.dir.
>
> However, when I run spark-shell, the shuffle writes are always written to
> /tmp despite spark.local.dir being set when the worker node is started.
>
> Specifying spark.local.dir for the driver program seems to override the
> executors' setting. Is there a way to properly define it on the worker
> node?
>
> Thanks!
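
For anyone finding this thread in the archives: below is a minimal sketch of the two places this setting can come from, assuming a standalone cluster. The precedence between the worker-side environment and the application's spark.local.dir has changed across releases (the docs say SPARK_LOCAL_DIRS takes precedence from 1.0 on), so treat this as a starting point rather than a confirmed fix. The paths and the LocalDirExample name are made up for illustration.

    // On each worker node, conf/spark-env.sh can export a per-machine scratch
    // directory. According to the docs, in Spark 1.0+ (standalone mode) this
    // overrides whatever spark.local.dir the application sets. The path below
    // is hypothetical and would differ on each node:
    //
    //   export SPARK_LOCAL_DIRS=/data1/spark/tmp
    //
    // In the driver program, one option is to avoid setting spark.local.dir at
    // all, so the driver does not push a single directory down to every
    // executor; executors then use whatever their own node is configured with
    // (falling back to /tmp if nothing is configured).
    import org.apache.spark.{SparkConf, SparkContext}

    object LocalDirExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("local-dir-example")
        // Deliberately not calling conf.set("spark.local.dir", ...) here.
        val sc = new SparkContext(conf)
        // ... job code ...
        sc.stop()
      }
    }

If the driver itself also needs a non-/tmp scratch directory, it can still set spark.local.dir in its own SparkConf or in conf/spark-defaults.conf; with the per-node SPARK_LOCAL_DIRS in place that should not clobber the executors.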