To be honest, my advice was just based on your post - we only use Hive in AWS 
EMR, so I couldn't tell you.

Glad that at least you're back up though.
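
If you want something cleaner than overriding hive.conf.hidden.list, one approach I've seen suggested (untested on my end, since we rely on EMR's IAM roles) is to put the S3 keys in Hadoop's core-site.xml rather than hive-site.xml, so that Hive's hidden-list stripping never touches them. Roughly:

<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>

There is also the Hadoop credential provider mechanism (hadoop credential create fs.s3a.access.key -provider jceks://...), which keeps the keys out of plain-text XML entirely, though I haven't used it with Hive myself.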

----- Original Message -----
From: "Anup Tiwari" <anupsdtiw...@gmail.com>
To: user@hive.apache.org
Sent: Thursday, April 12, 2018 10:50:23 AM
Subject: Re: Unable to read Hive external table data linked to S3 after
upgrading from 2.1.1 to 2.3.3

Hi Richard,

After looking at the Hive 2.3.3 logs, I found that when all configuration
parameters are loaded, the following message is printed:

Values omitted for security reason if present: [fs.s3n.awsAccessKeyId, fs.s3a.access.key, fs.s3.awsAccessKeyId, hive.server2.keystore.password, fs.s3a.proxy.password, javax.jdo.option.ConnectionPassword, fs.s3.awsSecretAccessKey, fs.s3n.awsSecretAccessKey, fs.s3a.secret.key]


while in the Hive 2.1.1 logs I found the following message:

Values omitted for security reason if present: [hive.server2.keystore.password, 
javax.jdo.option.ConnectionPassword] 

Can this be the reason why Hive 2.3.3 is not able to read the S3-related
params from hive-site.xml?

I found a related JIRA: https://issues.apache.org/jira/browse/HIVE-14588
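
A quick way to compare the two versions is to print the effective list from the Hive CLI:

set hive.conf.hidden.list;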

Also, I have set the below property in hive-site.xml, excluding the above two
S3 variables (fs.s3a.access.key, fs.s3a.secret.key) from the list, and it worked:

<property>
  <name>hive.conf.hidden.list</name>
  <value>javax.jdo.option.ConnectionPassword,hive.server2.keystore.password,fs.s3.awsAccessKeyId,fs.s3.awsSecretAccessKey,fs.s3n.awsAccessKeyId,fs.s3n.awsSecretAccessKey,fs.s3a.proxy.password</value>
  <description>Comma-separated list of configuration options which should not be readable by a normal user, e.g. passwords. fs.s3a.access.key and fs.s3a.secret.key have been removed from the default list.</description>
</property>

Let me know if there is any other solution, because if these variables are
part of the hidden list by default, I assume there is some other, more proper
workaround for this.

Regards, 
Anup Tiwari 


On Thu, Apr 12, 2018 at 7:44 PM, Richard A. Bross <r...@oaktreepeak.com> wrote:


I hear you, but given the exception log, it does seem that it can't 
authenticate you. You can try using the AWS environment variables. If that 
resolves the issue then you'll have some more to go on. According to 
Hortonworks here: 

https://hortonworks.github.io/hdp-aws/s3-security/ 

"AWS CLI supports authentication through environment variables. These same 
environment variables will be used by Hadoop if no configuration properties are 
set." 
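
Concretely, that would be something like the following in the environment of whatever launches the Hive services (placeholder values; the variable names are the standard AWS SDK ones):

export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key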


----- Original Message ----- 
From: "Anup Tiwari" <anupsdtiw...@gmail.com>
To: user@hive.apache.org 
Sent: Thursday, April 12, 2018 10:06:33 AM 
Subject: Re: Unable to read Hive external table data linked to S3 after
upgrading from 2.1.1 to 2.3.3


We are not using EMR. Also, we have set the below params for accessing the S3
bucket in hive-site.xml, the same as what we had set in Hive 2.1.1 (a masked
sketch of the entries follows the list):


* fs.s3a.access.key 
* fs.s3a.secret.key 
* fs.s3a.connection.maximum 
* fs.s3a.impl 
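
For reference, they follow the usual property format in hive-site.xml, e.g. (key values masked; fs.s3a.impl is the standard S3AFileSystem class):

<property>
  <name>fs.s3a.access.key</name>
  <value>***masked***</value>
</property>
<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>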

Regards, 
Anup Tiwari 


On Thu, Apr 12, 2018 at 7:19 PM, Richard A. Bross <r...@oaktreepeak.com> wrote:


Based on the exception, it looks more like an AWS credentials issue than a Hive
issue. Are you running in AWS EMR, or on-prem?

In AWS, the resource accessing the S3 bucket would have to have an IAM role
that grants permission. If you are running somewhere else, whatever AWS login
you use would have to have the correct permissions attached via IAM.
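
For example, a minimal policy for that role or user would look something like this (bucket name is a placeholder):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::your-bucket/*"
    }
  ]
}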



----- Original Message ----- 
From: "Anup Tiwari" <anupsdtiw...@gmail.com>
To: user@hive.apache.org 
Sent: Thursday, April 12, 2018 9:11:37 AM 
Subject: Unable to read Hive external table data linked to S3 after upgrading
from 2.1.1 to 2.3.3

Hi All, 

When I am trying to read an S3-linked external table in 2.3.3, I am getting
errors. It was working properly in 2.1.1. Please find the details below and
let me know if I am missing something:


Hadoop version: 2.8.0

Query:

select log_date, count(1) as cnt from test.tt1 group by log_date;

Error:

Vertex failed, vertexName=Map 1, vertexId=vertex_1523502631429_0029_3_00, diagnostics=[Vertex vertex_1523502631429_0029_3_00 [Map 1] killed/failed due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: tt1 initializer failed, vertex=vertex_1523502631429_0029_3_00 [Map 1], org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on g24x7.new-analytics: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: The requested metadata is not found at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/ : No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: The requested metadata is not found at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/
at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:128)
at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:288)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:236)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:265)
at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:236)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:322)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:442)
at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:561)
at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:196)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:278)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:269)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:269)
at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: com.amazonaws.AmazonClientException: No AWS Credentials provided by BasicAWSCredentialsProvider EnvironmentVariableCredentialsProvider SharedInstanceProfileCredentialsProvider : com.amazonaws.AmazonClientException: The requested metadata is not found at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/
at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:151)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3779)
at com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1107)
at com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1070)
at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:276)
... 24 more
Caused by: com.amazonaws.AmazonClientException: The requested metadata is not found at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/
at com.amazonaws.internal.EC2MetadataClient.readResponse(EC2MetadataClient.java:111)
at com.amazonaws.internal.EC2MetadataClient.readResource(EC2MetadataClient.java:92)
at com.amazonaws.internal.EC2MetadataClient.getDefaultCredentials(EC2MetadataClient.java:55)
at com.amazonaws.auth.InstanceProfileCredentialsProvider.loadCredentials(InstanceProfileCredentialsProvider.java:186)
at com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:124)
at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:129)
... 28 more
]

Regards, 
Anup Tiwari 
