I'm running on EC2 and I want to set the directory to use on the slaves
(mounted EBS volumes).
I have set:

spark.local.dir /vol3/my-spark-dir

in /root/spark/conf/spark-defaults.conf and replicated it to all nodes. I have verified in the console that the value in the config matches.
Can you try setting SPARK_LOCAL_DIRS in spark-env.sh?
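For example, something like this (a minimal sketch, reusing the /vol3/my-spark-dir path from your message; spark-env.sh needs to be present on every node):

# /root/spark/conf/spark-env.sh -- sourced when each worker starts
export SPARK_LOCAL_DIRS=/vol3/my-spark-dir

In standalone mode SPARK_LOCAL_DIRS, when set for a worker, overrides spark.local.dir, which may be why the conf-file setting alone isn't taking effect.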
Cheers
On Fri, Feb 6, 2015 at 7:30 AM, Joe Wass jw...@crossref.org wrote:

> I'm running on EC2 and I want to set the directory to use on the slaves
> (mounted EBS volumes).
> I have set:
> spark.local.dir /vol3/my-spark-dir
> in [...]

Did you restart the slaves so they would read the settings? You don't need
to start/stop the EC2 cluster, just the slaves. From the master node:
$SPARK_HOME/sbin/stop-slaves.sh
$SPARK_HOME/sbin/start-slaves.sh
($SPARK_HOME is probably /root/spark)
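If you edited the conf only on the master, the spark-ec2 scripts also ship a copy-dir helper that can push it out to the slaves before the restart (a sketch, assuming the standard spark-ec2 layout; the exact path may differ on your cluster):

# replicate the updated conf directory to all slaves
/root/spark-ec2/copy-dir /root/spark/conf

Once a job runs after the restart, Spark's scratch directories (e.g. spark-local-*) should start appearing under /vol3/my-spark-dir on each slave rather than in the default location.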