[jira] [Commented] (SPARK-6935) spark/spark-ec2.py add parameters to give different instance types for master and slaves

2015-04-16 Thread Oleksii Mandrychenko (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14497861#comment-14497861 ]

Oleksii Mandrychenko commented on SPARK-6935:
---------------------------------------------

I didn't know there was a `--master-instance-type` option in the 1.3 release. I 
am still using the 1.2 release, which doesn't have it.

If the option is indeed present, this ticket can be closed as fixed in 1.3.

The refactoring should probably be tracked in a separate ticket, IMO.

 spark/spark-ec2.py add parameters to give different instance types for master 
 and slaves
 

 Key: SPARK-6935
 URL: https://issues.apache.org/jira/browse/SPARK-6935
 Project: Spark
  Issue Type: Improvement
  Components: EC2
Affects Versions: 1.3.0
Reporter: Oleksii Mandrychenko
Priority: Minor
  Labels: easyfix
   Original Estimate: 24h
  Remaining Estimate: 24h

 I want to start a cluster where I give beefy AWS instances to the slaves, such 
 as memory-optimised R3 types, but the master does not perform much 
 number-crunching work. It is therefore a waste to allocate a powerful instance 
 to the master when a regular one would suffice.
 Suggested syntax:
 {code}
 sh spark-ec2 --instance-type-slave=instance_type     # applies to slaves only
              --instance-type-master=instance_type    # applies to master only
              --instance-type=instance_type           # default, applies to both

 # in real world
 sh spark-ec2 --instance-type-slave=r3.2xlarge --instance-type-master=c3.xlarge
 {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-6935) spark/spark-ec2.py add parameters to give different instance types for master and slaves

2015-04-16 Thread Oleksii Mandrychenko (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14497885#comment-14497885 ]

Oleksii Mandrychenko commented on SPARK-6935:
---------------------------------------------

Oh... It's actually there in the 1.2 release as well. I did not notice it 
because the documentation is missing on this point:

https://spark.apache.org/docs/1.2.1/ec2-scripts.html

Perhaps somebody could go through the available options in the script and 
double-check that the documentation has matching entries.







[jira] [Created] (SPARK-6935) spark/spark-ec2.py add parameters to give different instance types for master and slaves

2015-04-15 Thread Oleksii Mandrychenko (JIRA)
Oleksii Mandrychenko created SPARK-6935:
---------------------------------------

 Summary: spark/spark-ec2.py add parameters to give different 
instance types for master and slaves
 Key: SPARK-6935
 URL: https://issues.apache.org/jira/browse/SPARK-6935
 Project: Spark
  Issue Type: Improvement
  Components: EC2
Affects Versions: 1.3.0
Reporter: Oleksii Mandrychenko
Priority: Minor


I want to start a cluster where I give beefy AWS instances to the slaves, such 
as memory-optimised R3 types, but the master does not perform much 
number-crunching work. It is therefore a waste to allocate a powerful instance 
to the master when a regular one would suffice.

Suggested syntax:

{code}
sh spark-ec2 --instance-type-slave=instance_type
             --instance-type-master=instance_type

# in real world
sh spark-ec2 --instance-type-slave=r3.2xlarge --instance-type-master=c3.xlarge
{code}
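The flag pair above could be wired into spark-ec2.py roughly as follows. This is a hypothetical sketch using Python's optparse (which the script uses); the option names, defaults, and fallback behaviour are assumptions from this ticket, not the flags actually shipped in any Spark release.

```python
# Sketch: role-specific instance-type flags with --instance-type as the
# shared default, as proposed in this ticket. Option names and the
# m1.large default are illustrative assumptions.
from optparse import OptionParser

def parse_instance_types(argv):
    parser = OptionParser(usage="spark-ec2 [options]")
    parser.add_option("--instance-type", default="m1.large",
                      help="Default instance type for master and slaves")
    parser.add_option("--instance-type-master", default=None,
                      help="Instance type for the master only")
    parser.add_option("--instance-type-slave", default=None,
                      help="Instance type for the slaves only")
    opts, _args = parser.parse_args(argv)
    # Fall back to the shared default when a role-specific flag is absent
    master_type = opts.instance_type_master or opts.instance_type
    slave_type = opts.instance_type_slave or opts.instance_type
    return master_type, slave_type

if __name__ == "__main__":
    print(parse_instance_types(
        ["--instance-type-slave=r3.2xlarge",
         "--instance-type-master=c3.xlarge"]))
```

With both role-specific flags given, as in the "real world" example above, the shared default is never consulted; with only `--instance-type`, both roles receive the same type.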







[jira] [Commented] (SPARK-6935) spark/spark-ec2.py add parameters to give different instance types for master and slaves

2015-04-15 Thread Oleksii Mandrychenko (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-6935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14496453#comment-14496453 ]

Oleksii Mandrychenko commented on SPARK-6935:
---------------------------------------------

Added support for a default flag to the suggested syntax in the description.







[jira] [Updated] (SPARK-6935) spark/spark-ec2.py add parameters to give different instance types for master and slaves

2015-04-15 Thread Oleksii Mandrychenko (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-6935?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Oleksii Mandrychenko updated SPARK-6935:

Description: 
I want to start a cluster where I give beefy AWS instances to the slaves, such 
as memory-optimised R3 types, but the master does not perform much 
number-crunching work. It is therefore a waste to allocate a powerful instance 
to the master when a regular one would suffice.

Suggested syntax:

{code}
sh spark-ec2 --instance-type-slave=instance_type     # applies to slaves only
             --instance-type-master=instance_type    # applies to master only
             --instance-type=instance_type           # default, applies to both

# in real world
sh spark-ec2 --instance-type-slave=r3.2xlarge --instance-type-master=c3.xlarge
{code}


  was:
I want to start a cluster where I give beefy AWS instances to slaves, such as 
memory-optimised R3, but master is not really performing much number crunching 
work. So it is a waste to allocate a powerful instance for master, where a 
regular one would suffice.

Suggested syntax:

{code}
sh spark-ec2 --instance-type-slave=instance_type
             --instance-type-master=instance_type

# in real world
sh spark-ec2 --instance-type-slave=r3.2xlarge --instance-type-master=c3.xlarge
{code}





