Re: making spark/conf/spark-defaults.conf changes take effect

2014-05-19 Thread Andrew Or
Hm, it should just take effect immediately. But yes, there is a script for syncing everything:

    /root/spark-ec2/copy-dir --delete /root/spark

After that you should do:

    /root/spark/sbin/stop-all.sh
    /root/spark/sbin/start-all.sh

2014-05-18 16:56 GMT-07:00 Daniel Mahler :
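For reference, the full sequence run on the master would look something like the following. This is only a sketch assuming the standard /root/spark layout that the spark-ec2 script sets up; adjust paths if your cluster differs:

    # edit the defaults file on the master node
    vi /root/spark/conf/spark-defaults.conf

    # push /root/spark (including conf/) out to the slaves, removing stale files
    /root/spark-ec2/copy-dir --delete /root/spark

    # restart the standalone cluster so the new settings are picked up
    /root/spark/sbin/stop-all.sh
    /root/spark/sbin/start-all.sh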

making spark/conf/spark-defaults.conf changes take effect

2014-05-18 Thread Daniel Mahler
I am running in an AWS EC2 cluster that I launched using the spark-ec2 script that comes with Spark, and I use the "-v master" option to run the head version. If I then log into the master and make changes to spark/conf/spark-defaults.conf, how do I make the changes take effect across the cluster? Is j
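For context, spark-defaults.conf holds whitespace-separated property/value pairs. The entries below are only illustrative examples of common settings, not anything specified in this thread, and <master-hostname> is a placeholder:

    spark.master            spark://<master-hostname>:7077
    spark.executor.memory   4g
    spark.serializer        org.apache.spark.serializer.KryoSerializer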