[ 
https://issues.apache.org/jira/browse/HIVE-9991?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sergio Peña updated HIVE-9991:
------------------------------
    Attachment: HIVE-9991.3.patch

Updated the patch to include the test in the TestEncryptedHdfsCliDriver suite.

It also uses pfile:/// instead of file:/// as the location in order to avoid
change-ownership errors.

> Cannot do a SELECT on external tables that are on S3 due to Encryption error
> ----------------------------------------------------------------------------
>
>                 Key: HIVE-9991
>                 URL: https://issues.apache.org/jira/browse/HIVE-9991
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 1.0.0
>            Reporter: Sergio Peña
>            Assignee: Sergio Peña
>         Attachments: HIVE-9991.1.patch, HIVE-9991.2.patch, HIVE-9991.3.patch
>
>
> I cannot run any SELECT query on external tables that are not on HDFS,
> for example tables on S3.
> {code}
> Select * from my_table limit 10;
> FAILED: SemanticException Unable to determine if s3n://my-bucket/ is
> encrypted: java.lang.IllegalArgumentException: Wrong FS: s3n://my-bucket/,
> expected: hdfs://0.0.0.0:8020
> {code}
> This error comes from an internal function that checks whether a table is
> encrypted. Encryption is only supported on HDFS files, but the check
> currently runs on all external tables as well, causing the above error.
> To fix this, we should perform the encryption check only for HDFS tables
> and skip it for any other filesystem scheme.
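The scheme-based guard described above can be sketched as follows. This is a minimal, self-contained illustration, not Hive's actual code: the class and method names are hypothetical, and the real fix inspects the table location inside Hive's semantic analyzer before calling the HDFS encryption API.

```java
import java.net.URI;

public class EncryptionCheckSketch {
    // Hypothetical helper: HDFS encryption zones only exist on HDFS, so any
    // other scheme (s3n, file, pfile, ...) is treated as unencrypted and the
    // HDFS-specific encryption lookup is skipped entirely.
    static boolean mightBeEncrypted(String location) {
        String scheme = URI.create(location).getScheme();
        return "hdfs".equalsIgnoreCase(scheme);
    }

    public static void main(String[] args) {
        // An HDFS path still goes through the encryption check.
        System.out.println(mightBeEncrypted("hdfs://0.0.0.0:8020/warehouse/t"));
        // An S3 path short-circuits, avoiding the "Wrong FS" exception.
        System.out.println(mightBeEncrypted("s3n://my-bucket/"));
    }
}
```

With this guard in place, the "Wrong FS" IllegalArgumentException can no longer be triggered, because the HDFS client is never asked about a non-HDFS path.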



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
