Thanks,

I also use Spark 1.2 with prebuilt for Hadoop 2.4. I launch both 1.1 and
1.2 with the same command:

./spark-ec2 -k foo -i bar.pem launch mycluster

By default this launches in us-east-1. I tried changing the region with
-r us-west-1, but that gave the same result:

Could not resolve AMI at:
https://raw.github.com/mesos/spark-ec2/v4/ami-list/us-west-1/pvm

Looking up https://raw.github.com/mesos/spark-ec2/v4/ami-list/us-west-1/pvm
in a browser returns the same AMI ID as yours. But if I search for
ami-7a320f3f in AWS, I can't find any such image. I tried searching in
every region I could find in the AWS console.
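Judging by the error message, the script seems to build the lookup URL from the repo branch, the region, and the virtualization type. A sketch of that URL construction (the v4 branch and pvm type are taken from the error text; the variable names are my own):

```shell
# Sketch: how the AMI-list URL in the error message appears to be composed.
# BRANCH (v4), REGION, and VTYPE (pvm vs. hvm) are assumptions from this thread.
BRANCH=v4
REGION=us-west-1
VTYPE=pvm
URL="https://raw.github.com/mesos/spark-ec2/${BRANCH}/ami-list/${REGION}/${VTYPE}"
echo "$URL"
```

Fetching that URL (e.g. with curl -s "$URL") should print the AMI ID the script would use, which is a quick way to check whether the list exists for a given region.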

The AMI for 1.1 is spark.ami.pvm.v9 (ami-5bb18832). I can find that AMI in
us-west-1.

Strange. Not sure what to do.

/Håkan

On Mon Jan 26 2015 at 9:02:42 AM Charles Feduke <charles.fed...@gmail.com>
wrote:

I definitely have Spark 1.2 running within EC2 using the spark-ec2 scripts.
I downloaded Spark 1.2 with prebuilt for Hadoop 2.4 and later.

What parameters are you using when you execute spark-ec2?


I am launching in the us-west-1 region (ami-7a320f3f), which may explain
things.

On Mon Jan 26 2015 at 2:40:01 AM hajons <haj...@gmail.com> wrote:

Hi,

When I try to launch a standalone cluster on EC2 using the scripts in the
ec2 directory for Spark 1.2, I get the following error:

Could not resolve AMI at:
https://raw.github.com/mesos/spark-ec2/v4/ami-list/us-east-1/pvm

It seems there is no AMI available on EC2 yet. Any idea when there will
be one?

This works without problems for version 1.1. Starting up a cluster using
these scripts is so simple and straightforward that I really miss it on
1.2.

/Håkan

--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/No-AMI-for-Spark-1-2-using-ec2-scripts-tp21362.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
