Hi neo0731,
Depending on what you mean by encryption, it could be Hadoop at-rest
encryption, Hadoop data transport encryption, Hadoop RPC encryption, HBase
data transport encryption, HBase RPC encryption, or SSL encryption (Hadoop
WebHDFS, HBase Thrift).

distcp supports Hadoop at-rest encryption, and it supports Hadoop data
transport encryption and Hadoop RPC encryption. There's a proposal to
support WebHDFS-style URLs in distcp, which would let it carry
SSL-encrypted traffic as well.

HBase replication supports Hadoop at-rest encryption; we've seen that use
case in production.
I am aware of some bugs in the HBase backup tool where it can't read files
inside Hadoop encryption zones.
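
On the re-hash part of the quoted question: as far as I know, neither
CopyTable nor distcp exposes a hook to transform cell values in flight
(distcp copies files byte-for-byte, and CopyTable copies cells as-is), so
the re-hashing would need to happen in a custom copy job. A minimal Scala
sketch of just the re-hash step is below; it assumes SHA-256 hex as the
new hash (your actual hash function may differ) and models a row as a
plain Map of qualifier to value rather than an HBase Result, so the same
function could be dropped into a Spark or MapReduce task that scans the
source table and Puts into the destination:

```scala
import java.security.MessageDigest

object RehashSketch {
  // The fields the question asks to re-hash while copying.
  val fieldsToRehash: Set[String] = Set(
    "hashed_email", "lexer_id", "foo_imsi",
    "foo_msn", "signal_uid", "bar_imsi")

  // SHA-256 hex digest of a cell value -- the choice of hash is an
  // assumption; swap in whatever your re-hash policy requires.
  def rehash(value: Array[Byte]): Array[Byte] =
    MessageDigest.getInstance("SHA-256")
      .digest(value)
      .map("%02x".format(_))
      .mkString
      .getBytes("UTF-8")

  // Re-hash only the listed qualifiers; pass every other cell through
  // unchanged. In a real job this runs per row inside the copy task.
  def transformRow(row: Map[String, Array[Byte]]): Map[String, Array[Byte]] =
    row.map { case (qualifier, value) =>
      if (fieldsToRehash(qualifier)) qualifier -> rehash(value)
      else qualifier -> value
    }
}
```

The copy job itself would wrap this in a scan over the source table and a
batch of Puts to the destination, which is standard HBase client code.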

On Tue, Oct 9, 2018 at 5:28 AM neo0731 <vva.bag...@gmail.com> wrote:

>
> Question arises when migrating the data from one hbase table to another.
>
> Input
>
> To sync the production cluster data with the dev cluster. Additionally,
> while copying we need to re-hash the following fields: hashed_email,
> lexer_id, foo_imsi, foo_msn, signal_uid, bar_imsi.
>
> Question is: does copyTable support hashing of data while copying? Same
> for the distcp utility? Is it possible to supply some example code in
> Scala as well?
>
> Any help on it would be much appreciated.
>
>
>
> --
> Sent from:
> http://apache-hbase.679495.n3.nabble.com/HBase-Developer-f679493.html
>


-- 
A very happy Clouderan
