be much appreciated!
Sam
- Original Message -
From: Krishna Sankar ksanka...@gmail.com
To: user@spark.apache.org
Sent: Wednesday, June 4, 2014 8:52:59 AM
Subject: Re: Trouble launching EC2 Cluster with Spark
One reason could be that the keys are in a different region. You need to create
the keys in the same region where you launch the cluster.
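A hedged sketch of how to check this, assuming the AWS CLI is installed and credentials are configured (the region names are placeholders): list the key pairs each region knows about and confirm your key appears in the region you are launching into.

```shell
# Key pairs are region-scoped; a key created in us-east-1 is invisible
# in us-west-2. Compare the output of the two calls.
aws ec2 describe-key-pairs --region us-east-1
aws ec2 describe-key-pairs --region us-west-2
```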
Should I make the permissions on my key file more private?
Thanks very much,
Sam
- Original Message -
From: Sam Taylor Steyer sste...@stanford.edu
To: user@spark.apache.org
Sent: Wednesday, June 4, 2014 12:42:04 PM
Subject: Re: Trouble launching EC2 Cluster with Spark
Thank you! The regions advice worked.
Awesome, that worked. Thank you!
- Original Message -
From: Krishna Sankar ksanka...@gmail.com
To: user@spark.apache.org
Sent: Wednesday, June 4, 2014 12:52:00 PM
Subject: Re: Trouble launching EC2 Cluster with Spark
chmod 600 path/FinalKey.pem
Cheers
k/
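A minimal sketch of the fix above, using a throwaway file in place of the real key (`FinalKey.pem` is the name from the thread; the temp directory is just for illustration):

```shell
# Stand-in for the downloaded key file.
tmpdir=$(mktemp -d)
key="$tmpdir/FinalKey.pem"
echo "dummy key material" > "$key"

# ssh refuses private keys that are readable by group or others;
# 600 = owner read/write only.
chmod 600 "$key"

# Show the resulting octal mode (GNU stat, falling back to BSD stat).
stat -c '%a' "$key" 2>/dev/null || stat -f '%Lp' "$key"
```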
Dear PJ$,
If you are familiar with Puppet, you could try the Puppet module I wrote
(currently for Spark 0.9.0; I custom-compiled it since no Debian package was
available at the time I started the project I needed it for).
https://github.com/stefanvanwouw/puppet-spark
---
Kind regards,
Running on a few m3.larges with the ami-848a6eec image (Debian 7). I haven't
gotten any further and have no clue what's wrong. I'd really appreciate any
guidance y'all could offer.
Best,
PJ$
On Sat, May 31, 2014 at 1:40 PM, Matei Zaharia matei.zaha...@gmail.com
wrote:
What instance types did you launch on?
So to run spark-ec2, you should use the default AMI that it launches with if
you don’t pass -a. Those AMIs are based on Amazon Linux, not Debian. Passing
your own AMI is an advanced option, because some software has to be
preinstalled on the AMI for our scripts to work with it.
Matei
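A hedged sketch of a launch along the lines Matei describes, with no -a flag so the script picks its default Amazon Linux AMI (the key pair name, region, and cluster name are placeholders, not from the thread; this needs AWS credentials to actually run):

```shell
# -k: EC2 key pair name (must exist in the chosen region)
# -i: local private key file (chmod 600)
# -r: region the key pair was created in
# -t: instance type; -s: number of slaves
./spark-ec2 -k FinalKey -i path/FinalKey.pem \
    -r us-east-1 -t m1.large -s 2 launch my-spark-cluster
```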
Ha, yes, I just went through this.
(a) You have to use the 'default' Spark AMI (ami-7a320f3f at the moment)
and not any of the other Linux distros. They don't work.
(b) Start with m1.large instances. I tried going for r3.large at first,
and had no end of self-caused trouble. m1.large works.
What instance types did you launch on?
Sometimes you also get a bad individual machine from EC2. It might help to
remove the node it’s complaining about from the conf/slaves file.
Matei
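A small sketch of removing a flaky worker the way Matei suggests, assuming conf/slaves holds one worker hostname per line (the hostnames below are made up; a temp directory stands in for the real conf/):

```shell
# Stand-in for conf/slaves with three workers.
tmpdir=$(mktemp -d)
slaves="$tmpdir/slaves"
printf '%s\n' \
    ec2-54-0-0-1.compute-1.amazonaws.com \
    ec2-54-0-0-2.compute-1.amazonaws.com \
    ec2-54-0-0-3.compute-1.amazonaws.com > "$slaves"

# Drop the node the scripts keep complaining about; grep -v keeps the rest.
bad=ec2-54-0-0-2.compute-1.amazonaws.com
grep -v "^$bad\$" "$slaves" > "$slaves.tmp" && mv "$slaves.tmp" "$slaves"

cat "$slaves"
```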
On May 30, 2014, at 11:18 AM, PJ$ p...@chickenandwaffl.es wrote:
Hey Folks,
I'm really having quite a bit of trouble getting Spark running on EC2. I'm
not using the scripts at https://github.com/apache/spark/tree/master/ec2
because I'd like to know how everything works, but I'm going a little
crazy. I think that something about the networking configuration must be
the problem.