Re: Copying all Hive tables from Prod to UAT

2016-05-26 Thread Mich Talebzadeh
That is a good point, Jörn. With regard to JDBC and Hive data, I believe you can use JDBC to get compressed data from an Oracle or Sybase database, because decompression happens at the time of data access, much like using a sqlplus or isql tool. However, it is worth trying what happens when one

Re: Copying all Hive tables from Prod to UAT

2016-05-26 Thread Elliot West
Hello, I've been looking at this recently for moving Hive tables from on-premise clusters to the cloud, but the principle should be the same for your use case. If you wish to do this in an automated way, some tools worth considering are: - Hive's built-in replication framework:

Re: Copying all Hive tables from Prod to UAT

2016-05-26 Thread Jörn Franke
Or use Falcon ... I would try to avoid Spark JDBC. JDBC is not designed for these big-data bulk operations, e.g. data has to be transferred uncompressed, and there is the serialization/deserialization overhead: query result -> protocol -> Java objects -> writing to a specific storage format, etc.

Re: Copying all Hive tables from Prod to UAT

2016-05-25 Thread Gopal Vijayaraghavan
> We are using HDP. Is there any feature in Ambari? Apache Falcon handles data lifecycle management, not Ambari: https://falcon.apache.org/0.8/HiveDR.html Cheers, Gopal

Re: Copying all Hive tables from Prod to UAT

2016-05-25 Thread mahender bigdata
Thanks Mich. I will look into it. On 5/25/2016 9:05 AM, Mich Talebzadeh wrote: There are multiple ways of doing this without relying on any vendor's release. 1) Using Hive's EXPORT/IMPORT utility: EXPORT TABLE table_or_partition TO hdfs_path; IMPORT [[EXTERNAL] TABLE table_or_partition]

Re: Copying all Hive tables from Prod to UAT

2016-05-25 Thread mahender bigdata
We are using HDP. Is there any feature in Ambari? On 5/25/2016 6:50 AM, Suresh Kumar Sethuramaswamy wrote: Hi, If you are using CDH, you could do inter-cluster Hive data transfer, including metadata, via CM (Backup -> Replications). Regards, Suresh On Wednesday, May 25, 2016, mahender bigdata

Re: Copying all Hive tables from Prod to UAT

2016-05-25 Thread Mich Talebzadeh
There are multiple ways of doing this without relying on any vendor's release. 1) Using Hive's EXPORT/IMPORT utility: EXPORT TABLE table_or_partition TO hdfs_path; IMPORT [[EXTERNAL] TABLE table_or_partition] FROM hdfs_path [LOCATION [table_location]]; 2) This works for individual tables but you can
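The EXPORT/IMPORT utility above is per-table, so covering every table means generating the paired statements from a table list. A minimal sketch in Python; the database, table names, and staging path below are hypothetical, for illustration only:

```python
def export_import_hql(database, tables, staging_dir):
    """Generate paired EXPORT (run on the source cluster) and IMPORT
    (run on the target cluster) statements, one staging subdirectory
    per table so the dumps do not collide."""
    exports, imports = [], []
    for table in tables:
        path = f"{staging_dir}/{table}"
        exports.append(f"EXPORT TABLE {database}.{table} TO '{path}';")
        imports.append(f"IMPORT TABLE {database}.{table} FROM '{path}';")
    return exports, imports

# Hypothetical names for illustration only.
exports, imports = export_import_hql(
    "sales", ["orders", "customers"], "/tmp/hive_staging")
```

In practice the table list would come from `SHOW TABLES` on the source metastore rather than being hard-coded.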

Re: Copying all Hive tables from Prod to UAT

2016-05-25 Thread Suresh Kumar Sethuramaswamy
Hi, If you are using CDH, you could do inter-cluster Hive data transfer, including metadata, via CM (Backup -> Replications). Regards, Suresh On Wednesday, May 25, 2016, mahender bigdata wrote: > Any document on it? > > On 4/8/2016 6:28 PM, Will Du wrote: > > did you

Re: Copying all Hive tables from Prod to UAT

2016-05-25 Thread mahender bigdata
Any document on it? On 4/8/2016 6:28 PM, Will Du wrote: did you try the EXPORT and IMPORT statements in HQL? On Apr 8, 2016, at 6:24 PM, Ashok Kumar wrote: Hi, Does anyone have suggestions on how to create and copy Hive and Spark tables from

Re: Copying all Hive tables from Prod to UAT

2016-04-08 Thread Will Du
Did you try the EXPORT and IMPORT statements in HQL? > On Apr 8, 2016, at 6:24 PM, Ashok Kumar wrote: > > Hi, > > Does anyone have suggestions on how to create and copy Hive and Spark tables from > Production to UAT? > > One way would be to copy table data to external files and then

Copying all Hive tables from Prod to UAT

2016-04-08 Thread Ashok Kumar
Hi, Does anyone have suggestions on how to create and copy Hive and Spark tables from Production to UAT? One way would be to copy table data to external files, then move the external files to a target directory and populate the tables in the target Hive with that data. Is there an easier way of doing
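The manual flow described here (dump each table to files, move the files between clusters, reload on the target) can be scripted end to end. A dry-run sketch in Python that only builds the shell commands one might run; the NameNode URIs and paths are hypothetical, and nothing is executed:

```python
def build_copy_plan(database, tables, src_nn, dst_nn,
                    staging="/tmp/hive_staging"):
    """Return the ordered shell commands for a manual Prod -> UAT copy:
    EXPORT on the source, distcp the staged files across clusters,
    IMPORT on the target. Commands are built as strings, not run."""
    plan = []
    for table in tables:
        path = f"{staging}/{table}"
        plan.append(f'hive -e "EXPORT TABLE {database}.{table} TO \'{path}\';"')
        plan.append(f"hadoop distcp {src_nn}{path} {dst_nn}{path}")
        plan.append(f'hive -e "IMPORT TABLE {database}.{table} FROM \'{path}\';"')
    return plan

# Hypothetical NameNode URIs for illustration.
for cmd in build_copy_plan("sales", ["orders"],
                           "hdfs://prod-nn:8020", "hdfs://uat-nn:8020"):
    print(cmd)
```

Reviewing the printed plan before running it is the point of the dry run; the later replies in this thread (CM replication, Falcon HiveDR) automate the same export/copy/import cycle.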