Yes, I did. I have also tested my Parquet file, which resides in the
encryption zone, and it can be read with Hive and parquet-tools.

- kidong
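
For reference, a minimal sketch of where those two KMS settings live in the Hadoop configs rather than the Drill storage plugin (the host name and port below are placeholders, not my actual values):

```xml
<!-- core-site.xml: key provider used by Hadoop clients,
     including the HDFS client embedded in Drill. -->
<property>
  <name>hadoop.security.key.provider.path</name>
  <value>kms://http@kms-host:16000/kms</value>
</property>

<!-- hdfs-site.xml: key provider used by the HDFS services. -->
<property>
  <name>dfs.encryption.key.provider.uri</name>
  <value>kms://http@kms-host:16000/kms</value>
</property>
```

After changing these files, the Drillbits need a restart so the new Hadoop configuration is picked up.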

On Tuesday, June 28, 2016, Parth Chandra <pchan...@maprtech.com> wrote:

> Hi Kidong,
>
>   I haven't tried this myself, but my guess is that the KMS settings need
> to be provided at the HDFS layer not in the drill storage plugin.
>
>   Specify hadoop.security.key.provider.path in core-site
>
>   Specify dfs.encryption.key.provider.uri  in hdfs-site
>
>   Or did you already do that?
>
> Parth
>
>
> On Mon, Jun 27, 2016 at 1:11 AM, Kidong Lee <mykid...@gmail.com> wrote:
>
> > Hi,
> >
> > I got some problem using drill with HDFS Encryption.
> >
> > With both the Hive and DFS storage plugins, I get errors like this:
> > Error: SYSTEM ERROR: IOException: No KeyProvider is configured, cannot
> > access an encrypted file
> >
> > Even after adding the configs below to the Drill storage plugins, the
> > result is the same:
> >
> > in dfs storage:
> > "config": {
> >     "hadoop.security.key.provider.path": "kms://h...@xxxx.com;xxxx.com:16000/kms",
> >     "dfs.encryption.key.provider.uri": "kms://h...@xxxx.com;xxxx.com:16000/kms"
> >   }
> >
> > in hive storage:
> > "configProps": {
> >   ...
> >     "hadoop.security.key.provider.path": "kms://h...@xxxx.com;xxxx.com:16000/kms",
> >     "dfs.encryption.key.provider.uri": "kms://h...@xxxx.com;xxxx.com:16000/kms"
> >   }
> >
> > I have tested Hive itself against tables of the encrypted files on
> > HDFS, and it works fine.
> >
> > Any ideas?
> >
> > - Kidong Lee.
> >
>
