[
https://issues.apache.org/jira/browse/SQOOP-3385?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16622258#comment-16622258
]
Steve Loughran commented on SQOOP-3385:
---------------------------------------
# AWS environment variables will be picked up; it's safer to use those than to
put your AWS secrets on the command line, where they are visible to anyone on
your machine who can run {{ps}} (see the sketch below this list)
# Don't mix Hadoop library or AWS SDK versions. Make sure the hadoop-* JARs
are in sync and that the aws-sdk JARs are exactly the ones shipped with that
Hadoop release (a quick classpath check is sketched below)
# ConnectionReset is a TCP-level network error. If it's not proxies, it'll be
endpoint settings, https vs. http (an endpoint probe is sketched below the list)
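For point 1, a minimal sketch of the environment-variable route. The JDBC URL,
table and bucket names are placeholders, and it assumes the s3a:// connector
(hadoop-aws) is already on the Sqoop classpath:
{code}
# Export the credentials instead of passing them on the sqoop command line;
# the S3A credential chain can read these environment variables.
export AWS_ACCESS_KEY_ID=...        # placeholder
export AWS_SECRET_ACCESS_KEY=...    # placeholder

sqoop import \
  --connect jdbc:mysql://onprem-db:3306/sales \
  --username dbuser -P \
  --table orders \
  --target-dir s3a://my-bucket/sqoop/orders
{code}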
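For point 2, one quick way to see which hadoop-aws and AWS SDK JARs are
actually on the classpath (a sanity check only; paths and versions vary by
distribution):
{code}
# --glob expands wildcard classpath entries on recent Hadoop releases.
# The hadoop-aws version should match the other hadoop-* JARs, and the
# aws-java-sdk JAR should be the one bundled with that Hadoop release.
hadoop classpath --glob | tr ':' '\n' | grep -Ei 'hadoop-aws|aws-java-sdk'
{code}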
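For point 3, a quick connectivity probe with an explicit endpoint, useful for
separating credential problems from network/endpoint ones (bucket name and
endpoint are examples only):
{code}
# fs.s3a.endpoint and fs.s3a.connection.ssl.enabled are standard S3A options;
# adjust them to the region / proxy setup actually in use.
hadoop fs -Dfs.s3a.endpoint=s3.eu-west-1.amazonaws.com \
          -Dfs.s3a.connection.ssl.enabled=true \
          -ls s3a://my-bucket/
{code}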
> Error while connecting to S3 using Sqoop
> ----------------------------------------
>
> Key: SQOOP-3385
> URL: https://issues.apache.org/jira/browse/SQOOP-3385
> Project: Sqoop
> Issue Type: Bug
> Components: connectors
> Affects Versions: 1.4.7
> Reporter: Suchit
> Priority: Minor
> Labels: S3
>
> I am facing an issue while trying to import a file from an on-prem DB to S3
> using Sqoop.
> Things I am able to do:
> 1. I am connected to S3, able to run aws s3 ls and other AWS CLI commands
> 2. Able to import a file from the DB to the local Unix box
> But when I change the target directory to S3 instead of local, I get the
> error below:
>
> "ERROR tool.ImportTool: Import failed: AWS Access Key ID and Secret Access
> Key must be specified as the username or password (respectively) of a s3 URL,
> or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties
> (respectively)."
>
> Ideally the Sqoop installation should be able to pick up the credentials
> from the credentials file inside the .aws directory of the user running the
> command, but is there a way I can specify the credentials explicitly?
>
> Thanks in advance.
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)