Re: HiveContext: Unable to load AWS credentials from any provider in the chain

2016-06-10 Thread Daniel Haviv
I'm using EC2 instances 

Thank you.
Daniel

> On 9 Jun 2016, at 16:49, Gourav Sengupta  wrote:
> 
> Hi,
> 
> Are you using EC2 instances or a local cluster behind a firewall?
> 
> 
> Regards,
> Gourav Sengupta
> 
>> On Wed, Jun 8, 2016 at 4:34 PM, Daniel Haviv 
>>  wrote:
>> Hi,
>> I'm trying to create a table on s3a but I keep hitting the following error:
>> Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: 
>> MetaException(message:com.cloudera.com.amazonaws.AmazonClientException: 
>> Unable to load AWS credentials from any provider in the chain)
>>  
>> I tried setting the s3a keys using the configuration object but I might be 
>> hitting SPARK-11364:
>> conf.set("fs.s3a.access.key", accessKey)
>> conf.set("fs.s3a.secret.key", secretKey)
>> conf.set("spark.hadoop.fs.s3a.access.key",accessKey)
>> conf.set("spark.hadoop.fs.s3a.secret.key",secretKey)
>> val sc = new SparkContext(conf)
>>  
>> I tried setting these properties in hdfs-site.xml but I'm still getting this 
>> error.
>> Finally I tried to set the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY 
>> environment variables but with no luck.
>>  
>> Any ideas on how to resolve this issue?
>>  
>> Thank you.
>> Daniel
>> 
> 


Re: HiveContext: Unable to load AWS credentials from any provider in the chain

2016-06-09 Thread Gourav Sengupta
Hi,

Are you using EC2 instances or a local cluster behind a firewall?


Regards,
Gourav Sengupta

On Wed, Jun 8, 2016 at 4:34 PM, Daniel Haviv <
daniel.ha...@veracity-group.com> wrote:

> Hi,
>
> I'm trying to create a table on s3a but I keep hitting the following error:
>
> Exception in thread "main"
> org.apache.hadoop.hive.ql.metadata.HiveException: 
> MetaException(message:com.cloudera.com.amazonaws.AmazonClientException:
> Unable to load AWS credentials from any provider in the chain)
>
>
>
> I tried setting the s3a keys using the configuration object but I might be
> hitting SPARK-11364:
>
> conf.set("fs.s3a.access.key", accessKey)
> conf.set("fs.s3a.secret.key", secretKey)
> conf.set("spark.hadoop.fs.s3a.access.key",accessKey)
> conf.set("spark.hadoop.fs.s3a.secret.key",secretKey)
>
> val sc = new SparkContext(conf)
>
>
>
> I tried setting these properties in hdfs-site.xml but I'm still getting
> this error.
>
> Finally I tried to set the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
> environment variables but with no luck.
>
>
>
> Any ideas on how to resolve this issue?
>
>
>
> Thank you.
>
> Daniel
>
>


Re: HiveContext: Unable to load AWS credentials from any provider in the chain

2016-06-09 Thread Steve Loughran

On 9 Jun 2016, at 06:17, Daniel Haviv wrote:

Hi,
I've set these properties both in core-site.xml and hdfs-site.xml with no luck.

Thank you.
Daniel


That's not good.

I'm afraid I don't know what version of s3a is in the Cloudera release; I can 
see that the Amazon stuff has been shaded, but I don't know about the Hadoop 
side and its auth.

One thing: can you try using s3n rather than s3a? I do think s3a is now better 
(and will be *really* good soon), but as s3n has been around for a long time, 
it's the baseline for functionality.
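
For example, a minimal, untested sketch of switching over. The fs.s3n.* property 
names are the standard s3n ones; the hiveContext, table and bucket names below 
are placeholders, not from this thread:

// Untested sketch: s3n reads its credentials from its own property names.
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", accessKey)
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", secretKey)
// The same table-creation attempt, but against an s3n:// path instead of s3a://
hiveContext.sql("CREATE EXTERNAL TABLE test_tbl (c STRING) LOCATION 's3n://some-bucket/some/path/'")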

And I've just created some homework to do better logging of what's going on in 
the s3a driver, though that bit of startup code in Spark might interfere: 
https://issues.apache.org/jira/browse/HADOOP-13252


There's not much else I can do, I'm afraid, not without patching your Hadoop 
source and rebuilding things.

-Steve






Re: HiveContext: Unable to load AWS credentials from any provider in the chain

2016-06-08 Thread Daniel Haviv
Hi,
I've set these properties both in core-site.xml and hdfs-site.xml with no luck.

Thank you.
Daniel

> On 9 Jun 2016, at 01:11, Steve Loughran  wrote:
> 
> 
>> On 8 Jun 2016, at 16:34, Daniel Haviv  
>> wrote:
>> 
>> Hi,
>> I'm trying to create a table on s3a but I keep hitting the following error:
>> Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: 
>> MetaException(message:com.cloudera.com.amazonaws.AmazonClientException: 
>> Unable to load AWS credentials from any provider in the chain)
>>  
>> I tried setting the s3a keys using the configuration object but I might be 
>> hitting SPARK-11364:
>> conf.set("fs.s3a.access.key", accessKey)
>> conf.set("fs.s3a.secret.key", secretKey)
>> conf.set("spark.hadoop.fs.s3a.access.key",accessKey)
>> conf.set("spark.hadoop.fs.s3a.secret.key",secretKey)
>> val sc = new SparkContext(conf)
>>  
>> I tried setting these properties in hdfs-site.xml but I'm still getting this 
>> error.
> 
> 
> 
> Try core-site.xml rather than hdfs-site.xml; the latter only gets loaded when 
> an HdfsConfiguration() instance is created; it may be a bit too late.
> 
>> Finally I tried to set the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY 
>> environment variables but with no luck.
> 
> Those env vars aren't picked up directly by S3a (well, that was fixed over 
> the weekend: https://issues.apache.org/jira/browse/HADOOP-12807); there's 
> some fixup in Spark (see 
> SparkHadoopUtil.appendS3AndSparkHadoopConfigurations()); I don't know if 
> that is a factor.
> 
>> Any ideas on how to resolve this issue?
>>  
>> Thank you.
>> Daniel
>> 
> 


Re: HiveContext: Unable to load AWS credentials from any provider in the chain

2016-06-08 Thread Steve Loughran

On 8 Jun 2016, at 16:34, Daniel Haviv wrote:

Hi,
I'm trying to create a table on s3a but I keep hitting the following error:
Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: 
MetaException(message:com.cloudera.com.amazonaws.AmazonClientException: Unable 
to load AWS credentials from any provider in the chain)



I tried setting the s3a keys using the configuration object but I might be 
hitting SPARK-11364:

conf.set("fs.s3a.access.key", accessKey)
conf.set("fs.s3a.secret.key", secretKey)
conf.set("spark.hadoop.fs.s3a.access.key",accessKey)
conf.set("spark.hadoop.fs.s3a.secret.key",secretKey)

val sc = new SparkContext(conf)



I tried setting these properties in hdfs-site.xml but I'm still getting this 
error.



Try core-site.xml rather than hdfs-site.xml; the latter only gets loaded when 
an HdfsConfiguration() instance is created; it may be a bit too late.
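
For reference, a minimal core-site.xml sketch; the property names are the ones 
from the original post, and the values are placeholders for real credentials:

<!-- sketch only: add these to core-site.xml in the Hadoop configuration directory -->
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>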

Finally I tried to set the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY environment 
variables but with no luck.




Those env vars aren't picked up directly by S3a (well, that was fixed over the 
weekend: https://issues.apache.org/jira/browse/HADOOP-12807); there's some 
fixup in Spark (see SparkHadoopUtil.appendS3AndSparkHadoopConfigurations()); 
I don't know if that is a factor.
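
A rough sketch (not Spark's actual fixup code) of doing that translation by 
hand, assuming sc is the SparkContext you have already created:

// Rough sketch: copy the AWS environment variables into the live Hadoop
// configuration so the s3a connector can see them.
val hadoopConf = sc.hadoopConfiguration
sys.env.get("AWS_ACCESS_KEY_ID").foreach(key => hadoopConf.set("fs.s3a.access.key", key))
sys.env.get("AWS_SECRET_ACCESS_KEY").foreach(secret => hadoopConf.set("fs.s3a.secret.key", secret))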

Any ideas on how to resolve this issue?



Thank you.
Daniel




HiveContext: Unable to load AWS credentials from any provider in the chain

2016-06-08 Thread Daniel Haviv
Hi,
I'm trying to create a table on s3a but I keep hitting the following error:
Exception in thread "main" org.apache.hadoop.hive.ql.metadata.HiveException: 
MetaException(message:com.cloudera.com.amazonaws.AmazonClientException: Unable 
to load AWS credentials from any provider in the chain)
 
I tried setting the s3a keys using the configuration object but I might be 
hitting SPARK-11364:
conf.set("fs.s3a.access.key", accessKey)
conf.set("fs.s3a.secret.key", secretKey)
conf.set("spark.hadoop.fs.s3a.access.key",accessKey)
conf.set("spark.hadoop.fs.s3a.secret.key",secretKey)
val sc = new SparkContext(conf)
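
A variation worth noting (a sketch only, not verified against this setup): if 
SparkConf really is applied too late, the keys can instead be set on the live 
Hadoop configuration after the context has been created:

// Unverified sketch: set the s3a keys after the SparkContext exists,
// directly on its Hadoop configuration rather than via SparkConf.
val sc = new SparkContext(conf)
sc.hadoopConfiguration.set("fs.s3a.access.key", accessKey)
sc.hadoopConfiguration.set("fs.s3a.secret.key", secretKey)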
 
I tried setting these properties in hdfs-site.xml but I'm still getting this 
error.
Finally I tried to set the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY environment 
variables but with no luck.
 
Any ideas on how to resolve this issue?
 
Thank you.
Daniel
