Did you try with s3a? Also make sure your key does not have any wildcard
chars in it.
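
Something like this should work with s3a (just a rough, untested sketch; it
assumes the hadoop-aws jar and a matching AWS SDK are on your classpath, and
the bucket/path below is only a placeholder):

    // Same imports as in your snippet (Configuration, LongWritable, Text, Function)
    Configuration hadoopConf = ctx.hadoopConfiguration();
    // s3a reads its credentials from these properties instead of the fs.s3n.* ones
    hadoopConf.set("fs.s3a.access.key", this.getAwsAccessKeyId());
    hadoopConf.set("fs.s3a.secret.key", this.getAwsSecretAccessKey());

    // Same newAPIHadoopFile call, but pointed at an s3a:// URI instead of s3n://
    lines = ctx.newAPIHadoopFile("s3a://your-bucket/path/to/input",
                    NonSplitableTextInputFormat.class, LongWritable.class,
                    Text.class, hadoopConf)
            .values()
            .map(new Function<Text, String>() {
                @Override
                public String call(Text text) throws Exception {
                    return text.toString();
                }
            });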

Thanks
Best Regards

On Fri, Aug 21, 2015 at 2:03 AM, Shuai Zheng <szheng.c...@gmail.com> wrote:

> Hi All,
>
> I am trying to read a file from S3 using a Hadoop input format.
>
> Below is my code:
>
>     // Set the s3n credentials on the Hadoop configuration used by Spark
>     Configuration hadoopConf = ctx.hadoopConfiguration();
>     hadoopConf.set("fs.s3n.awsAccessKeyId", this.getAwsAccessKeyId());
>     hadoopConf.set("fs.s3n.awsSecretAccessKey", this.getAwsSecretAccessKey());
>
>     // Read the input with the new Hadoop API and map each Text record to a String
>     lines = ctx.newAPIHadoopFile(inputPath, NonSplitableTextInputFormat.class,
>                     LongWritable.class, Text.class, hadoopConf)
>             .values()
>             .map(new Function<Text, String>() {
>                 @Override
>                 public String call(Text arg0) throws Exception {
>                     return arg0.toString();
>                 }
>             });
>
> And I get the error below:
>
> Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: s3n://************
>         at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.processException(Jets3tNativeFileSystemStore.java:449)
>         at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.processException(Jets3tNativeFileSystemStore.java:427)
>
> The permissions should not be the problem (I can read the same path with
> ctx.textFile without any issue), so the issue seems to come from the
> newAPIHadoopFile call.
>
> Is there anything else I need to set up for this?
>
> Regards,
>
> Shuai
>
