Hi Shivaram,

OK, so let's assume the script CANNOT take a different user and that it must
be 'root'. The typical workaround is, as you said, to allow ssh access for the
root user. Now, don't laugh, but this worked last Friday, yet today
(Monday) it no longer works. :D Why? ...

...It seems that NOW, when you launch a 'paravirtual' AMI, the root user's
'authorized_keys' file is always overwritten. This means the workaround
doesn't work anymore! I would LOVE for someone to verify this.
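
For what it's worth, my guess (unverified, so treat it as an assumption) is
that cloud-init is doing the overwriting. If so, the 'disable_root' setting
in /etc/cloud/cloud.cfg on the instance should control it, roughly:

  # /etc/cloud/cloud.cfg
  disable_root: 0   # when 1/true, cloud-init rewrites root's authorized_keys on boot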

Just to point out, I am trying to make this work with a paravirtual
instance and not an HVM instance.

Please and thanks,
Marco.


On Mon, Apr 7, 2014 at 4:40 PM, Shivaram Venkataraman <
shivaram.venkatara...@gmail.com> wrote:

> Right now the spark-ec2 scripts assume that you have root access, and a lot
> of the internal scripts assume the user's home directory is hard-coded as
> /root. However, all the Spark AMIs we build should have root ssh access --
> do you find this not to be the case?
>
> You can also enable root ssh access in a vanilla AMI by editing
> /etc/ssh/sshd_config and setting "PermitRootLogin" to yes
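>
> A minimal sketch of that change (assuming a stock Amazon Linux image, where
> the SSH daemon runs as the 'sshd' service):
>
>   # /etc/ssh/sshd_config
>   PermitRootLogin yes
>
>   # reload sshd so the setting takes effect
>   sudo service sshd reload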
>
> Thanks
> Shivaram
>
>
>
> On Mon, Apr 7, 2014 at 11:14 AM, Marco Costantini <
> silvio.costant...@granatads.com> wrote:
>
>> Hi all,
>> On the old Amazon Linux EC2 images, the user 'root' was enabled for ssh.
>> Also, it is the default user for the Spark-EC2 script.
>>
>> Currently, the Amazon Linux images have an 'ec2-user' set up for ssh
>> instead of 'root'.
>>
>> I can see that the Spark-EC2 script allows you to specify which user to
>> log in with, but even when I change this, the script fails for various
>> reasons. And from the output it SEEMS that the script still assumes the
>> specified user's home directory is '/root'.
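>>
>> For reference, the command I run looks roughly like the following
>> (assuming the option is the '--user' flag I see in the script's help; the
>> key and cluster names here are just placeholders):
>>
>>   ./spark-ec2 -k my-key -i my-key.pem --user=ec2-user launch my-cluster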
>>
>> Am I using this script wrong?
>> Has anyone had success with this 'ec2-user' user?
>> Any ideas?
>>
>> Please and thank you,
>> Marco.
>>
>
>
