Sweet, thanks for the instructions. This will do for resizing a dev cluster
that you can bring down at will.
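
For anyone who wants to script that workaround rather than click through the
console, something like the following should work with the AWS CLI (a rough,
untested sketch: the AMI ID, key name, and instance IDs are placeholders, and
I am assuming a cluster named mycluster that is fully stopped):

    # Add two slaves to the stopped cluster by launching instances into
    # the mycluster-slaves security group (placeholder AMI and key name).
    aws ec2 run-instances \
        --image-id ami-xxxxxxxx \
        --count 2 \
        --instance-type m1.large \
        --key-name my-key \
        --security-groups mycluster-slaves

    # Remove slaves by terminating their instances (placeholder IDs).
    aws ec2 terminate-instances --instance-ids i-xxxxxxxx i-yyyyyyyy

    # On the next start, spark-ec2 will set the cluster up on whatever
    # machines it finds in the mycluster-slaves security group.
    ./spark-ec2 -k my-key -i my-key.pem start mycluster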

I will open a JIRA issue about adding to spark-ec2 the functionality I
described.


On Fri, Apr 4, 2014 at 3:43 PM, Matei Zaharia <matei.zaha...@gmail.com> wrote:

> This can't be done through the script right now, but you can do it
> manually as long as the cluster is stopped. While it's stopped, just go
> into the AWS Console, right-click a slave, and choose "launch more of
> these" to add slaves, or select several slaves and terminate them to
> remove some. The next time you run spark-ec2 start, it will set the
> cluster up on all the machines it finds in the mycluster-slaves security
> group.
>
> This is pretty hacky, so it would definitely be good to add this feature;
> feel free to open a JIRA about it.
>
> Matei
>
> On Apr 4, 2014, at 12:16 PM, Nicholas Chammas <nicholas.cham...@gmail.com>
> wrote:
>
> I would like to be able to use spark-ec2 to launch new slaves and add them
> to an existing, running cluster. Similarly, I would like to be able to
> remove slaves from a running cluster.
>
> Use cases include:
>
>    1. Oh snap, I sized my cluster incorrectly. Let me add/remove some
>    slaves.
>    2. During scheduled batch processing, I want to add some new slaves,
>    perhaps on spot instances. When that processing is done, I want to kill
>    them. (Cruel, I know.)
>
> I gather this is not possible at the moment. spark-ec2 appears to be able
> to launch new slaves for an existing cluster only if the master is stopped.
> I also do not see any ability to remove slaves from a cluster.
>
> Is that correct? Are there plans to add such functionality to spark-ec2 in
> the future?
>
> Nick
>
>
