We are not using EMR. Also, we have set the parameters below for accessing
the S3 bucket in hive-site.xml; they are the same ones we had set in Hive
2.1.1 (a sketch of the entries follows the list).


   - fs.s3a.access.key
   - fs.s3a.secret.key
   - fs.s3a.connection.maximum
   - fs.s3a.impl
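
For reference, a minimal sketch of how those entries look in our
hive-site.xml (the access/secret key values below are placeholders, not
our real credentials):

<property>
  <name>fs.s3a.access.key</name>
  <value>PLACEHOLDER_ACCESS_KEY</value> <!-- placeholder -->
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>PLACEHOLDER_SECRET_KEY</value> <!-- placeholder -->
</property>
<property>
  <name>fs.s3a.connection.maximum</name>
  <value>100</value> <!-- placeholder; tuned per load -->
</property>
<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>

fs.s3a.impl points at the standard S3A filesystem class from hadoop-aws,
same as in our 2.1.1 setup.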


Regards,
Anup Tiwari

On Thu, Apr 12, 2018 at 7:19 PM, Richard A. Bross <r...@oaktreepeak.com>
wrote:

> Based on the exception, it looks more like an AWS credentials issue than a
> Hive issue.  Are you running in AWS EMR, or on-prem?
>
> In AWS, the resource accessing the S3 bucket would have to have an IAM role
> that grants permission.  If you are running somewhere else, whatever AWS
> login you use would have to have the correct permissions in its IAM policy.
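>
> For what it's worth, if you end up using static keys rather than an
> instance role, you can also pin the S3A credentials provider so the
> connector does not fall through to the EC2 instance-metadata lookup. A
> minimal sketch, assuming the SimpleAWSCredentialsProvider class shipped
> with Hadoop 2.8's hadoop-aws (verify the class name against your build):
>
> <property>
>   <name>fs.s3a.aws.credentials.provider</name>
>   <!-- assumption: provider class name per Hadoop 2.8 docs; check your jars -->
>   <value>org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider</value>
> </property>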
>
> ----- Original Message -----
> From: "Anup Tiwari" <anupsdtiw...@gmail.com>
> To: user@hive.apache.org
> Sent: Thursday, April 12, 2018 9:11:37 AM
> Subject: Unable to read hive external table data which is linked to s3
> after upgrading from 2.1.1 to 2.3.3
>
>
>
> Hi All,
>
> When I try to read an S3-linked external table in 2.3.3, I get errors;
> it was working properly in 2.1.1. Please find the details below and let
> me know if I am missing something:
>
>
> Hadoop version: 2.8.0
>
> Query:
>
> select log_date, count(1) as cnt from test.tt1 group by log_date;
>
> Error:
>
> Vertex failed, vertexName=Map 1, vertexId=vertex_1523502631429_0029_3_00,
> diagnostics=[Vertex vertex_1523502631429_0029_3_00 [Map 1] killed/failed
> due to:ROOT_INPUT_INIT_FAILURE, Vertex Input: tt1 initializer failed,
> vertex=vertex_1523502631429_0029_3_00 [Map 1],
> org.apache.hadoop.fs.s3a.AWSClientIOException: doesBucketExist on
> g24x7.new-analytics: com.amazonaws.AmazonClientException: No AWS
> Credentials provided by BasicAWSCredentialsProvider
> EnvironmentVariableCredentialsProvider
> SharedInstanceProfileCredentialsProvider:
> com.amazonaws.AmazonClientException: The requested metadata is not found
> at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/:
> No AWS Credentials provided by BasicAWSCredentialsProvider
> EnvironmentVariableCredentialsProvider
> SharedInstanceProfileCredentialsProvider:
> com.amazonaws.AmazonClientException: The requested metadata is not found
> at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/
> at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:128)
> at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:288)
> at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:236)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
> at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:265)
> at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:236)
> at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:322)
> at org.apache.hadoop.hive.ql.io.HiveInputFormat.addSplitsForGroup(HiveInputFormat.java:442)
> at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:561)
> at org.apache.hadoop.hive.ql.exec.tez.HiveSplitGenerator.initialize(HiveSplitGenerator.java:196)
> at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:278)
> at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable$1.run(RootInputInitializerManager.java:269)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
> at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:269)
> at org.apache.tez.dag.app.dag.RootInputInitializerManager$InputInitializerCallable.call(RootInputInitializerManager.java:253)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: com.amazonaws.AmazonClientException: No AWS Credentials
> provided by BasicAWSCredentialsProvider
> EnvironmentVariableCredentialsProvider
> SharedInstanceProfileCredentialsProvider:
> com.amazonaws.AmazonClientException: The requested metadata is not found
> at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/
> at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:151)
> at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3779)
> at com.amazonaws.services.s3.AmazonS3Client.headBucket(AmazonS3Client.java:1107)
> at com.amazonaws.services.s3.AmazonS3Client.doesBucketExist(AmazonS3Client.java:1070)
> at org.apache.hadoop.fs.s3a.S3AFileSystem.verifyBucketExists(S3AFileSystem.java:276)
> ... 24 more
> Caused by: com.amazonaws.AmazonClientException: The requested metadata is
> not found at http://16x.xxx.xxx.xx4/latest/meta-data/iam/security-credentials/
> at com.amazonaws.internal.EC2MetadataClient.readResponse(EC2MetadataClient.java:111)
> at com.amazonaws.internal.EC2MetadataClient.readResource(EC2MetadataClient.java:92)
> at com.amazonaws.internal.EC2MetadataClient.getDefaultCredentials(EC2MetadataClient.java:55)
> at com.amazonaws.auth.InstanceProfileCredentialsProvider.loadCredentials(InstanceProfileCredentialsProvider.java:186)
> at com.amazonaws.auth.InstanceProfileCredentialsProvider.getCredentials(InstanceProfileCredentialsProvider.java:124)
> at org.apache.hadoop.fs.s3a.AWSCredentialProviderList.getCredentials(AWSCredentialProviderList.java:129)
> ... 28 more
> ]
>
> Regards,
> Anup Tiwari
>
