Sqoop_Sql_blob_types

2016-04-27 Thread Ajay Chander
Hi Everyone, I have a table in which a few columns are BLOB types holding huge amounts of data. Is there a best way to 'sqoop import' it into Hive tables without losing any data? Any help is highly appreciated. Thank you!

Re: Sqoop_Sql_blob_types

2016-04-27 Thread Mich Talebzadeh
Is the source of the data Oracle?

Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
http://talebzadehmich.wordpress.com

Re: Sqoop_Sql_blob_types

2016-04-27 Thread Ajay Chander
Mich, thanks for looking into this. At this point the source is MySQL. Thank you!

Re: Sqoop_Sql_blob_types

2016-04-27 Thread Jörn Franke
You could try importing them as binary. Is it just for storing the blobs, or for doing analysis on them? In the first case you may think about storing them as files in HDFS and including in Hive just a string containing the file name (to make analysis on the other data faster). In the latter case you should thin
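The "import as binary" suggestion above can be sketched as a Sqoop invocation. This is only an illustrative command, not one from the thread: the host, database, credentials, and parallelism are placeholder assumptions, and `--map-column-java file_data=String` forces Sqoop to carry the BLOB column through its Java `String` mapping so it survives a text-based Hive import.

```shell
# Hypothetical sketch -- host, database, and credentials are placeholders.
sqoop import \
  --connect jdbc:mysql://mysql-host/mydb \
  --username myuser -P \
  --table test \
  --map-column-java file_data=String \
  --hive-import \
  --hive-table test \
  -m 1
```

Note that mapping a large binary column through `String` assumes the payload is textual (as it is later in this thread); for truly opaque binary data, the HDFS-files-plus-filename approach Jörn describes avoids the encoding question entirely.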

Re: Sqoop_Sql_blob_types

2016-04-27 Thread Ajay Chander
Thanks, Franke! Probably I don't want to move the data directly into Hive now. My SQL database contains a table 'test' with two columns (file_name char(100), file_data longblob). Column 'file_data' may contain XML-formatted data or pipe-delimited data, and it is a huge amount of data. Right now I am conside
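Since `file_data` may hold either XML or pipe-delimited text, any pipeline built on the HDFS-files approach needs to tell the two apart before routing each payload. The helper below is a hypothetical sketch (the function name and the heuristics are assumptions, not anything from the thread): it decodes the blob as UTF-8 and applies a simple first-character / delimiter check.

```python
def classify_blob(payload: bytes) -> str:
    """Guess the format of a longblob payload: 'xml', 'pipe', or 'unknown'.

    Heuristic sketch: XML documents start with '<' (a declaration or a root
    tag); pipe-delimited records carry '|' in their first line.
    """
    text = payload.decode("utf-8", errors="replace").lstrip()
    if text.startswith("<"):
        return "xml"
    first_line = text.splitlines()[0] if text else ""
    if "|" in first_line:
        return "pipe"
    return "unknown"


# Example payloads of the two shapes described in the thread.
print(classify_blob(b"<?xml version='1.0'?><doc/>"))  # xml
print(classify_blob(b"a|b|c\nd|e|f"))                 # pipe
```

In practice this check could run in the job that writes each blob out to an HDFS file, storing the detected format alongside the file name in the Hive table.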