[ https://issues.apache.org/jira/browse/HADOOP-6681?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12854098#action_12854098 ]

Tom White commented on HADOOP-6681:
-----------------------------------

Passing these credentials automatically may not be what the user wants, and may 
in fact be a security liability (since the credentials are passed to the 
cluster, which may be shared with other users). You can achieve the same result 
explicitly with the following command line arguments:

{code}
--env AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
--env AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
{code}
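
For example, a full launch invocation might look like the following (the cluster
name and size are placeholders; the {{--env}} flags are the relevant part):

{code}
# Illustrative invocation: "my-cluster" and the slave count 10 are
# placeholders. The --env flags forward the local credentials to the
# instance init script explicitly.
hadoop-ec2 launch-cluster \
  --env AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID \
  --env AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY \
  my-cluster 10
{code}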

Does this solve your issue?

> Fill AWS credentials when configuring Hadoop on EC2 instances
> -------------------------------------------------------------
>
>                 Key: HADOOP-6681
>                 URL: https://issues.apache.org/jira/browse/HADOOP-6681
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: contrib/cloud
>            Reporter: Andrew Klochkov
>         Attachments: HADOOP-6681.patch
>
>
> There's a function "configure_hadoop" in the hadoop-ec2-init-remote.sh script 
> used to configure EC2 nodes for Hadoop. The function reads the 
> AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, but they 
> are never passed to it. This can be fixed in service.py by passing those 
> variables through.
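
For context, the pattern inside configure_hadoop looks roughly like the sketch
below. This is illustrative, not a verbatim excerpt of the script; the property
names follow Hadoop's S3 filesystem settings. The point is that if the
variables are never passed to the instance, the generated values are silently
empty.

{code}
# Simplified sketch of configure_hadoop in hadoop-ec2-init-remote.sh
# (illustrative, not a verbatim excerpt). Unset AWS_ACCESS_KEY_ID and
# AWS_SECRET_ACCESS_KEY expand to empty strings here.
configure_hadoop() {
  cat > $HADOOP_CONF_DIR/core-site.xml <<EOF
<configuration>
  <property>
    <name>fs.s3n.awsAccessKeyId</name>
    <value>$AWS_ACCESS_KEY_ID</value>
  </property>
  <property>
    <name>fs.s3n.awsSecretAccessKey</name>
    <value>$AWS_SECRET_ACCESS_KEY</value>
  </property>
</configuration>
EOF
}
{code}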
