Marco,
If you call spark-ec2 launch without specifying an AMI, it will default to
the Spark-provided AMI.
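As a minimal sketch (key pair name, identity file, cluster size, and
cluster name are placeholders), such a launch looks like:

    # No -a/--ami flag given, so spark-ec2 falls back to its default Spark AMI.
    ./spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem -s 2 launch my-spark-cluster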
Nick
On Wed, Apr 9, 2014 at 9:43 AM, Marco Costantini
silvio.costant...@granatads.com wrote:
Hi there,
To answer your question: no, there is no reason NOT to use an AMI that
Spark has provided.
And for the record, that AMI is ami-35b1885c. Again, you don't need to
specify it explicitly; spark-ec2 will default to it.
On Wed, Apr 9, 2014 at 11:08 AM, Nicholas Chammas
nicholas.cham...@gmail.com wrote:
Marco,
If you call spark-ec2 launch without specifying an AMI, it will default to
the Spark-provided AMI.
Ah, tried that. I believe this is an HVM AMI? We are exploring paravirtual
AMIs.
On Wed, Apr 9, 2014 at 11:17 AM, Nicholas Chammas
nicholas.cham...@gmail.com wrote:
And for the record, that AMI is ami-35b1885c. Again, you don't need to
specify it explicitly; spark-ec2 will default to it.
The AMI should automatically switch between PVM and HVM based on the
instance type you specify on the command line. For reference (note you
don't need to specify this on the command line), the PVM AMI ID
is ami-5bb18832 in us-east-1.
FWIW, we maintain the list of AMI IDs (across regions and PVM/HVM variants)
in the spark-ec2 repository.
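To illustrate (the instance type names here are only assumed examples of
paravirtual- vs HVM-class types), the AMI variant follows whatever you pass
to -t/--instance-type:

    # A paravirtual-class type, e.g. m1.large, should select the PVM AMI.
    ./spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem -t m1.large launch pvm-cluster
    # An HVM-class type, e.g. m3.2xlarge, should select the HVM AMI.
    ./spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem -t m3.2xlarge launch hvm-cluster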
Another thing I didn't mention: the AMI and user used. Naturally, I've
created several of my own AMIs with the following characteristics, none of
which worked.
1) Enabling ssh as root as per this guide (
http://blog.tiger-workshop.com/enable-root-access-on-amazon-ec2-instance/).
When doing this, I found that the change does not stick: the
'/root/.ssh/authorized_keys' file is regenerated at boot. I was able to
keep the workaround ...around... by overwriting the generated
'/root/.ssh/authorized_keys' file with a known good one, from the
'/etc/rc.local' file.
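A minimal sketch of that rc.local trick (the staging path for the
known-good file is hypothetical):

    # In /etc/rc.local: restore root's authorized_keys on every boot, after
    # cloud-init has regenerated the stock one.
    # '/root/.ssh/authorized_keys.good' is a hypothetical staging location.
    cp /root/.ssh/authorized_keys.good /root/.ssh/authorized_keys
    chown root:root /root/.ssh/authorized_keys
    chmod 600 /root/.ssh/authorized_keys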
On Tue, Apr 8, 2014 at 10:12 AM, Marco Costantini
silvio.costant...@granatads.com wrote:
Hi all,
On the old Amazon Linux EC2 images, the user 'root' was enabled for ssh.
Also, it is the default user for the Spark-EC2 script.
Currently, the Amazon Linux images have an 'ec2-user' set up for ssh
instead of 'root'.
I can see that the Spark-EC2 script allows you to specify which user to
log in with.
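If that option works as advertised, a sketch of such a launch (the flag
name is assumed to be --user; the other names are placeholders) would be:

    # Connect as ec2-user instead of root via the script's SSH-user option.
    ./spark-ec2 --user ec2-user -k my-keypair -i ~/.ssh/my-keypair.pem launch my-cluster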
Hi Shivaram,
OK, so let's assume the script CANNOT take a different user and that it
must be 'root'. The typical workaround is, as you said, to allow ssh with
the root user. Now, don't laugh, but this worked last Friday, yet today
(Monday) it no longer works. :D Why? ...
...It seems that NOW,
Hmm -- that is strange. Can you paste the command you are using to launch
the instances? The typical workflow is to use the spark-ec2 wrapper script,
following the guidelines at
http://spark.apache.org/docs/latest/ec2-scripts.html
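For reference, beyond the launch command sketched earlier, the documented
workflow also covers logging in and tearing down (names are placeholders):

    # Log in to a running cluster, and destroy it when finished.
    ./spark-ec2 -k my-keypair -i ~/.ssh/my-keypair.pem login my-cluster
    ./spark-ec2 destroy my-cluster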
Shivaram
On Mon, Apr 7, 2014 at 1:53 PM, Marco Costantini
silvio.costant...@granatads.com wrote: