[ 
https://issues.apache.org/jira/browse/SPARK-6935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14497852#comment-14497852
 ] 

Yu Ishikawa commented on SPARK-6935:
------------------------------------

That's reasonable. Spark v1.3.0 has the `--master-instance-type` option, not 
`--instance-type-master`. Do you mean that we should add the new options and 
deprecate the current one?

I also feel like refactoring the script, because it has many functions with long 
lines, which makes the source code a little hard to maintain. Just a comment.
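For illustration, here is a minimal sketch of how an optparse-based parser could 
expose both flags, with the master type falling back to the slave type when it is 
not given. The parser setup and defaults below are assumptions for the sketch, not 
copied from the real spark_ec2.py.

{code}
# Sketch only -- parser setup and defaults are assumptions,
# not a patch against the actual spark_ec2.py.
from optparse import OptionParser

parser = OptionParser(usage="spark-ec2 [options] <action> <cluster_name>")
parser.add_option(
    "-t", "--instance-type", default="m1.large",
    help="Type of instance to launch for the slaves (default: %default)")
parser.add_option(
    "--master-instance-type", default="",
    help="Master instance type (leave empty for the same as --instance-type)")

(opts, args) = parser.parse_args()

# Fall back to the slave instance type when no master type is given.
master_type = opts.master_instance_type or opts.instance_type
{code}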

> spark/spark-ec2.py add parameters to give different instance types for master 
> and slaves
> ----------------------------------------------------------------------------------------
>
>                 Key: SPARK-6935
>                 URL: https://issues.apache.org/jira/browse/SPARK-6935
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2
>    Affects Versions: 1.3.0
>            Reporter: Oleksii Mandrychenko
>            Priority: Minor
>              Labels: easyfix
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I want to start a cluster where I give beefy AWS instances to the slaves, such as
> memory-optimised R3 instances, but the master does not really perform much
> number-crunching work, so it is a waste to allocate a powerful instance for the
> master where a regular one would suffice.
> Suggested syntax:
> {code}
> sh spark-ec2 --instance-type-slave=<instance_type>     # applies to slaves only
>              --instance-type-master=<instance_type>    # applies to master only
>              --instance-type=<instance_type>           # default, applies to both
> # in real world
> sh spark-ec2 --instance-type-slave=r3.2xlarge --instance-type-master=c3.xlarge
> {code}



