[ https://issues.apache.org/jira/browse/SPARK-5668?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-5668.
------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0

Issue resolved by pull request 4457
[https://github.com/apache/spark/pull/4457]

> spark_ec2.py region parameter should either be mandatory or have its value displayed
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-5668
>                 URL: https://issues.apache.org/jira/browse/SPARK-5668
>             Project: Spark
>          Issue Type: Improvement
>          Components: EC2
>    Affects Versions: 1.2.0, 1.3.0, 1.4.0
>            Reporter: Miguel Peralvo
>            Priority: Minor
>              Labels: starter
>             Fix For: 1.4.0
>
>
> If the region parameter is not specified when invoking spark-ec2
> (spark-ec2.py behind the scenes), it defaults to us-east-1. When the cluster
> doesn't belong to that region, then after showing the "Searching for existing
> cluster Spark..." message it fails with an "ERROR: Could not find any existing
> cluster" error, because it doesn't find your cluster in the default region.
> As the message says nothing about the region, this can be a small headache
> for new users.
> Dmitriy Selivanov explains it in
> http://stackoverflow.com/questions/21171576/why-does-spark-ec2-fail-with-error-could-not-find-any-existing-cluster.
> I propose that:
> 1. Either we make the search message more informative, e.g. "Searching for
> existing cluster Spark in region " + opts.region (see the sketch below).
> 2. Or we remove us-east-1 as the default and make the --region parameter
> mandatory.
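> A minimal sketch of the first option, assuming spark_ec2.py's optparse-based
> argument handling (the usage string and cluster_name handling here are
> illustrative, not the actual spark_ec2.py code):
>
>     from optparse import OptionParser
>
>     parser = OptionParser(usage="spark-ec2 [options] <action> <cluster_name>")
>     # Proposal 1: keep the us-east-1 default but echo the region back.
>     parser.add_option("-r", "--region", default="us-east-1",
>                       help="EC2 region to search for the cluster in")
>     (opts, args) = parser.parse_args()
>     cluster_name = args[-1] if args else "Spark"
>
>     # Including the region in the message makes a default-region mismatch
>     # obvious instead of a bare "Could not find any existing cluster" error.
>     print("Searching for existing cluster {c} in region {r}...".format(
>         c=cluster_name, r=opts.region))
>
> optparse has no built-in notion of required options, so the second proposal
> would amount to dropping the default and calling parser.error() when
> opts.region is None.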



