That is a good point, Jorn, with regard to JDBC and Hive data.
I believe you can use JDBC to get compressed data from an Oracle or
Sybase database, because decompression happens at the time of data access, much
like using a sqlplus or isql tool.
However, it is worth trying what happens when one ...
Hello,
I've been looking at this recently for moving Hive tables from on-premises
clusters to the cloud, but the principle should be the same for your
use case. If you wish to do this in an automated way, some tools worth
considering are:
- Hive's built-in replication framework:
Or use Falcon ...
I would try to avoid the Spark JDBC route. JDBC is not designed for these
big-data bulk operations: data has to be transferred uncompressed, and there is
the serialization/deserialization overhead of query result -> protocol -> Java
objects -> writing to a specific storage format, etc.
> We are using HDP. Is there any feature in Ambari?
Apache Falcon handles data lifecycle management, not Ambari.
https://falcon.apache.org/0.8/HiveDR.html
Cheers,
Gopal
Thanks Mich. I will look into it.
On 5/25/2016 9:05 AM, Mich Talebzadeh wrote:
There are multiple ways of doing this without relying on any vendor's release.
1) Using Hive's EXPORT/IMPORT utility
EXPORT TABLE table_or_partition TO hdfs_path;
IMPORT [[EXTERNAL] TABLE table_or_partition] FROM hdfs_path [LOCATION
[table_location]];
We are using HDP. Is there any feature in Ambari?
On 5/25/2016 6:50 AM, Suresh Kumar Sethuramaswamy wrote:
Hi
If you are using CDH, then via CM, Backup -> Replications, you can do
inter-cluster Hive data transfer, including metadata.
Regards
Suresh
On Wednesday, May 25, 2016, mahender bigdata
There are multiple ways of doing this without relying on any vendor's release.
1) Using Hive's EXPORT/IMPORT utility
EXPORT TABLE table_or_partition TO hdfs_path;
IMPORT [[EXTERNAL] TABLE table_or_partition] FROM hdfs_path [LOCATION
[table_location]];
2) This works for individual tables, but you can
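The EXPORT/IMPORT route above can be sketched end to end as follows. This is a minimal sketch, not from the thread: the database/table names (prod_db.sales, uat_db.sales), staging paths, and NameNode URIs are all illustrative placeholders you would replace with your own.

```shell
# 1) On the source cluster: write the table's data and metadata
#    to an HDFS staging directory (path is a placeholder)
hive -e "EXPORT TABLE prod_db.sales TO '/tmp/export/sales';"

# 2) Copy the staging directory to the target cluster
#    (NameNode host:port URIs are placeholders)
hadoop distcp hdfs://prod-nn:8020/tmp/export/sales \
              hdfs://uat-nn:8020/tmp/export/sales

# 3) On the target cluster: recreate the table from the exported copy
hive -e "IMPORT TABLE uat_db.sales FROM '/tmp/export/sales';"
```

Because EXPORT writes both the data files and a metadata dump, the IMPORT on the target side recreates the table definition as well, which avoids having to pre-create a matching schema by hand.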
Hi
If you are using CDH, then via CM, Backup -> Replications, you can do
inter-cluster Hive data transfer, including metadata.
Regards
Suresh
On Wednesday, May 25, 2016, mahender bigdata
wrote:
> Any Document on it.
>
> On 4/8/2016 6:28 PM, Will Du wrote:
>
> did you
Any document on it?
On 4/8/2016 6:28 PM, Will Du wrote:
Did you try the EXPORT and IMPORT statements in HQL?
On Apr 8, 2016, at 6:24 PM, Ashok Kumar wrote:
Hi,
Does anyone have suggestions on how to create and copy Hive and Spark tables
from
Did you try the EXPORT and IMPORT statements in HQL?
> On Apr 8, 2016, at 6:24 PM, Ashok Kumar wrote:
>
> Hi,
>
> Does anyone have suggestions on how to create and copy Hive and Spark tables
> from Production to UAT?
>
> One way would be to copy table data to external files and then
Hi,
Does anyone have suggestions on how to create and copy Hive and Spark tables
from Production to UAT?
One way would be to copy the table data to external files, then move the
external files to a target directory and populate the tables in the target
Hive with that data.
Is there an easier way of doing this?
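The manual route described above (dump to files, move them, repopulate) can be sketched roughly as below. This is only an illustrative sketch: the table names, paths, and NameNode URIs are placeholders, and it assumes the target table already exists with a matching, delimited-text schema.

```shell
# 1) Source cluster: dump the table to a delimited text directory
#    (table name and path are placeholders)
hive -e "INSERT OVERWRITE DIRECTORY '/tmp/dump/sales'
         ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
         SELECT * FROM prod_db.sales;"

# 2) Copy the dump directory to the target cluster
hadoop distcp hdfs://prod-nn:8020/tmp/dump/sales \
              hdfs://uat-nn:8020/tmp/dump/sales

# 3) Target cluster: load the files into a pre-created table
#    whose schema and row format match the dump
hive -e "LOAD DATA INPATH '/tmp/dump/sales' INTO TABLE uat_db.sales;"
```

Unlike EXPORT/IMPORT, this carries no metadata, so the target table has to be created beforehand and kept in sync with the source schema by hand, which is one reason the EXPORT/IMPORT approach suggested in the thread is usually simpler.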