I have an NFS directory "/data" that all my nodes can read/write to. I
installed Spark in there:

bash-3.2$ echo $SPARK_HOME_DATA
/data/test/code/spark-0.7.3

Now, I don't want to hit '/data' all the time, since that will stress
my network. Aside from the logs, I don't think that will be a problem.
I've set my conf/spark-env.sh to look like:

#!/usr/bin/env bash

SCALA_HOME=/data/test/code/scala-2.9.3
SPARK_MASTER_IP=10.10.1.19
SPARK_WORKER_INSTANCES=1
SPARK_WORKER_DIR=/tmp/spark-worker-dir

I want the logs to go to $SPARK_WORKER_DIR. But when I start the
cluster, the logs still go to '/data':

10.10.1.18: starting spark.deploy.worker.Worker, logging to
/data/test/code/spark-0.7.3/bin/../logs/spark-rfcompton-spark.deploy.worker.Worker-1-node18.out

What's wrong?
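My guess is that the log location is controlled by a separate variable
than the worker scratch directory, so I tried adding SPARK_LOG_DIR to
conf/spark-env.sh. Note this variable name is an assumption on my part
(picked from reading bin/spark-daemon.sh), and /tmp/spark-logs is just
a placeholder path:

```shell
#!/usr/bin/env bash

SCALA_HOME=/data/test/code/scala-2.9.3
SPARK_MASTER_IP=10.10.1.19
SPARK_WORKER_INSTANCES=1

# Local scratch space for the worker (shuffle files, etc.)
SPARK_WORKER_DIR=/tmp/spark-worker-dir

# Assumption: the daemon launch scripts consult SPARK_LOG_DIR for log
# output; when it is unset, logs appear to default to $SPARK_HOME/logs,
# which in my case is the NFS path under /data.
SPARK_LOG_DIR=/tmp/spark-logs
```

Is that the right variable, or is there another setting I'm missing?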
