Is there a way to use the spark-ec2 launch script with a locally built version of 
Spark? I launch and destroy clusters fairly frequently and would like to avoid 
waiting each time for the master instance to compile the source, as happens 
when I pass the -v flag with the latest git commit. To be clear, I would like to 
launch a non-release version of Spark, compiled locally, as quickly as I can 
launch a release version (e.g. -v 1.2.0), which does not have to be compiled 
at launch.
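For context, this is roughly the launch I mean (cluster name, key pair, and key 
path are placeholders; the script builds the command as a dry run rather than 
actually calling spark-ec2):

```shell
#!/bin/sh
# Dry-run sketch of a spark-ec2 launch pinned to a prebuilt release.
# CLUSTER and KEY are hypothetical placeholders; -v 1.2.0 selects a
# release tarball, so nothing is compiled on the master at launch.
CLUSTER=my-cluster
KEY=mykey

LAUNCH_CMD="./spark-ec2 -k $KEY -i ~/.ssh/$KEY.pem -s 2 -v 1.2.0 launch $CLUSTER"

# Print the command instead of running it (no AWS credentials needed here).
echo "$LAUNCH_CMD"
```

With a git commit hash in place of 1.2.0, the same command triggers a full 
source build on the master, which is the wait I am trying to avoid.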

Up to this point, I have used the launch script included with the latest 
release to set up the cluster, then manually replaced the assembly jar on 
the master and slaves with the version I built locally and stored on S3. 
Is there anything wrong with doing it this way? And is there a better or 
more standard way of accomplishing this?
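Concretely, the manual replacement I do looks like the following (again a 
dry-run sketch: the bucket name, jar name, and master address are hypothetical 
placeholders, and the copy-dir path assumes the helper that spark-ec2 installs 
on the master at /root/spark-ec2/copy-dir):

```shell
#!/bin/sh
# Dry-run sketch of the manual assembly-replacement workflow.
# All names below are placeholders, not real resources.
BUCKET="s3://my-bucket"
ASSEMBLY="spark-assembly-1.3.0-SNAPSHOT-hadoop1.0.4.jar"
MASTER="root@<master-public-dns>"

# 1. Upload the locally built assembly jar to S3.
UPLOAD_CMD="aws s3 cp assembly/target/scala-2.10/$ASSEMBLY $BUCKET/$ASSEMBLY"

# 2. On the master, pull the jar down over the release assembly.
FETCH_CMD="ssh $MASTER \"aws s3 cp $BUCKET/$ASSEMBLY /root/spark/lib/\""

# 3. Sync the replaced jar out to all slaves using the copy-dir
#    helper that spark-ec2 places on the master.
SYNC_CMD="ssh $MASTER \"/root/spark-ec2/copy-dir /root/spark/lib\""

# Print the three steps instead of executing them.
echo "$UPLOAD_CMD"
echo "$FETCH_CMD"
echo "$SYNC_CMD"
```

It works, but it is three extra manual steps per cluster, which is why I am 
asking whether there is a supported way to point the launch script at a 
prebuilt non-release package directly.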