Try the following:

1. Set the access key and secret key in the sparkContext:

sparkContext.set("
> ​
> AWS_ACCESS_KEY_ID",yourAccessKey)

sparkContext.set("
> ​
> AWS_SECRET_ACCESS_KEY",yourSecretKey)
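A variant of this that also forwards the variables to the executors is
SparkConf.setExecutorEnv. A minimal sketch (the app name is a placeholder,
yourAccessKey/yourSecretKey are the same placeholders as above, and whether
the s3/s3n connector actually reads the env vars on the executor side
depends on your Spark build, so treat that part as an assumption):

import org.apache.spark.{SparkConf, SparkContext}

// set the AWS variables in each executor's environment as well
val conf = new SparkConf()
  .setAppName("s3-read")   // placeholder app name
  .setExecutorEnv("AWS_ACCESS_KEY_ID", yourAccessKey)
  .setExecutorEnv("AWS_SECRET_ACCESS_KEY", yourSecretKey)
val sparkContext = new SparkContext(conf)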


2. Set the access key and secret key in the environment before starting
your application:

export AWS_ACCESS_KEY_ID=<your access>
export AWS_SECRET_ACCESS_KEY=<your secret>
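If the variables are exported but access still fails, it can be worth
confirming from inside the driver that they are actually visible. A quick
check (it avoids printing the secret itself):

// sanity check: are the AWS variables visible to the driver JVM?
println(sys.env.get("AWS_ACCESS_KEY_ID"))                       // key id
println(sys.env.get("AWS_SECRET_ACCESS_KEY").map(_ => "<set>")) // masked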


3. Set the access key and secret key inside the Hadoop configuration:

val hadoopConf = sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3.impl", "org.apache.hadoop.fs.s3native.NativeS3FileSystem")
hadoopConf.set("fs.s3.awsAccessKeyId", yourAccessKey)
hadoopConf.set("fs.s3.awsSecretAccessKey", yourSecretKey)
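Since the error in your mail names the fs.s3n.* properties, here is the same
idea for s3n:// paths, ending with a read that forces the credentials to be
used (bucket and path are placeholders):

val hadoopConf = sparkContext.hadoopConfiguration
hadoopConf.set("fs.s3n.awsAccessKeyId", yourAccessKey)
hadoopConf.set("fs.s3n.awsSecretAccessKey", yourSecretKey)

val lines = sparkContext.textFile("s3n://yourBucket/path/")
println(lines.count())   // count() triggers the actual S3 read
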
4. You can also try embedding the credentials in the URL itself:

val lines = sparkContext.textFile("s3n://yourAccessKey:yourSecretKey@<yourBucket>/path/")

Note that if the secret key contains a "/" character, this form may not
parse correctly, in which case one of the earlier options is safer.

Thanks
Best Regards

On Mon, Oct 13, 2014 at 11:33 PM, Ranga <sra...@gmail.com> wrote:

> Hi
>
> I am trying to access files/buckets in S3 and encountering a permissions
> issue. The buckets are configured to authenticate using an IAM role provider.
> I have set the KeyId and Secret using environment variables (
> AWS_SECRET_ACCESS_KEY and AWS_ACCESS_KEY_ID). However, I am still unable
> to access the S3 buckets.
>
> Before setting the access key and secret the error was: 
> "java.lang.IllegalArgumentException:
> AWS Access Key ID and Secret Access Key must be specified as the username
> or password (respectively) of a s3n URL, or by setting the
> fs.s3n.awsAccessKeyId or fs.s3n.awsSecretAccessKey properties
> (respectively)."
>
> After setting the access key and secret, the error is: "The AWS Access
> Key Id you provided does not exist in our records."
>
> The id/secret being set are the right values. This makes me believe that
> something else ("token", etc.) needs to be set as well.
> Any help is appreciated.
>
>
> - Ranga
>
