How do I set spark.local.dirs?
I'm running on EC2 and I want to set the directory that the slaves use for temporary files (mounted EBS volumes). I have set:

spark.local.dir /vol3/my-spark-dir

in /root/spark/conf/spark-defaults.conf and replicated the file to all nodes. I have verified in the web console that the config shows this value, and I have checked that the file is present on the nodes. But Spark is still creating temp files in the wrong (default) place: /mnt2/spark

How do I get my slaves to pick up this value? How can I verify that they have?

Thanks!
Joe
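For reference, the setting as described would look like this (a sketch; the path /vol3/my-spark-dir is the one from the post). One thing worth knowing: on standalone workers, a SPARK_LOCAL_DIRS environment variable, if set in spark-env.sh, takes precedence over spark.local.dir for executor scratch space, which can explain a value in spark-defaults.conf being ignored.

```shell
# /root/spark/conf/spark-defaults.conf
# Whitespace-separated key and value; must be replicated to every node.
spark.local.dir /vol3/my-spark-dir
```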
Re: How do I set spark.local.dirs?
Can you try setting SPARK_LOCAL_DIRS in spark-env.sh?

Cheers

On Fri, Feb 6, 2015 at 7:30 AM, Joe Wass jw...@crossref.org wrote:
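A sketch of that suggestion, assuming the default spark-ec2 layout under /root/spark and the path from the original post:

```shell
# /root/spark/conf/spark-env.sh  (sketch; edit on the master, then copy to every slave)
export SPARK_LOCAL_DIRS=/vol3/my-spark-dir
```

If your spark-ec2 install includes the copy-dir helper script, it can push the edited conf directory to all slaves in one step; otherwise rsync/scp the file to each node.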
Re: How do I set spark.local.dirs?
Did you restart the slaves so they would read the settings? You don't need to stop/start the EC2 cluster, just the slaves. From the master node:

$SPARK_HOME/sbin/stop-slaves.sh
$SPARK_HOME/sbin/start-slaves.sh

($SPARK_HOME is probably /root/spark.)

On Fri Feb 06 2015 at 10:31:18 AM Joe Wass jw...@crossref.org wrote:
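Putting the restart and a verification step together (a sketch, assuming the spark-ec2 default of $SPARK_HOME=/root/spark and the directory from the post):

```shell
# Run on the master node: restart the standalone workers so they
# re-read spark-env.sh / spark-defaults.conf.
/root/spark/sbin/stop-slaves.sh
/root/spark/sbin/start-slaves.sh

# To verify, run a job and then check on a slave that scratch
# directories (named like spark-<uuid>) now appear under the new path:
ls /vol3/my-spark-dir
```

If nothing shows up there after a job runs, the workers are still using the old setting; the worker log on each slave also prints the local dirs it chose at startup, which is another place to check.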