Hi,

Are you using EC2 instances, or a local cluster behind a firewall?


Regards,
Gourav Sengupta

On Wed, Jun 8, 2016 at 4:34 PM, Daniel Haviv <
daniel.ha...@veracity-group.com> wrote:

> Hi,
>
> I'm trying to create a table on s3a but I keep hitting the following error:
>
> Exception in thread "main"
> org.apache.hadoop.hive.ql.metadata.HiveException:
> MetaException(message:com.cloudera.com.amazonaws.AmazonClientException:
> Unable to load AWS credentials from any provider in the chain)
>
>
>
> I tried setting the s3a keys on the SparkConf, but I might be hitting
> SPARK-11364 <https://issues.apache.org/jira/browse/SPARK-11364>:
>
> conf.set("fs.s3a.access.key", accessKey)
> conf.set("fs.s3a.secret.key", secretKey)
> conf.set("spark.hadoop.fs.s3a.access.key",accessKey)
> conf.set("spark.hadoop.fs.s3a.secret.key",secretKey)
>
> val sc = new SparkContext(conf)
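>
> I guess the alternative is to set the keys directly on the context's
> Hadoop configuration after it's created, something like this (just a
> sketch, using the same accessKey/secretKey values as above):
>
> // Sketch: setting the keys on sc.hadoopConfiguration sidesteps the
> // SparkConf propagation issue described in SPARK-11364.
> sc.hadoopConfiguration.set("fs.s3a.access.key", accessKey)
> sc.hadoopConfiguration.set("fs.s3a.secret.key", secretKey)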
>
>
>
> I tried setting these properties in hdfs-site.xml, but I'm still
> getting this error.
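>
> (The entries I mean are the standard Hadoop property form, roughly:)
>
> <property>
>   <name>fs.s3a.access.key</name>
>   <value><!-- access key here --></value>
> </property>
> <property>
>   <name>fs.s3a.secret.key</name>
>   <value><!-- secret key here --></value>
> </property>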
>
> Finally, I tried setting the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY
> environment variables, but with no luck.
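>
> i.e. before launching the job:
>
> export AWS_ACCESS_KEY_ID=<access key>
> export AWS_SECRET_ACCESS_KEY=<secret key>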
>
>
>
> Any ideas on how to resolve this issue?
>
>
>
> Thank you.
>
> Daniel
>
