Hi Abe,

Thanks for quickly highlighting the missing required info. Below are the details:
- *Version:* Sqoop 1.4.5
- *Sqoop command:*

  sqoop import --connect jdbc:teradata://aa.bb.cc.internal/DATABASE=someDB --username sqoop_usr --password sqoop_usr --table ENCRYPTED_TBL --fields-terminated-by \\001 -m 1 --target-dir /tmp/ENC_TBL --connection-manager "org.apache.sqoop.manager.GenericJdbcManager" --driver com.teradata.jdbc.TeraDriver

- *Table structure:* id: varchar, count: int, first_name: binary, email: binary, column5: varchar. Binary is used because the data is encrypted.

Thanks! (One thought on a possible direction is in the P.S. below.)

On Wed, Jul 15, 2015 at 6:44 PM, Abraham Elmahrek <[email protected]> wrote:

> Hey man,
>
> Need some details to help:
>
> - What version of Sqoop?
> - Sqoop command?
> - Database table structure (preferably a describe on the database)
>
> -Abe
>
> On Wed, Jul 15, 2015 at 6:42 PM, Suraj Nayak <[email protected]> wrote:
>
> > Hi Sqoop Users and Developers,
> >
> > How can I import a binary data column in a table into HDFS without
> > converting it into String?
> >
> > I have encrypted data in the RDBMS, and I need to import this column as is,
> > without converting it into a string. As of now, Sqoop is typecasting the
> > data into String/text, and decryption is failing in Hadoop.
> >
> > Can someone provide pointers to solve this? Any workaround?
> >
> > --
> > Thanks
> > Suraj Nayak M

--
Thanks
Suraj Nayak M
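P.S. One possible direction I was wondering about, untested on my end: since the issue shows up when the binary columns are written out as delimited text, maybe writing a binary container format such as Avro would keep the encrypted bytes intact. Below is only a sketch using the same connection details as the command above; the only change is adding --as-avrodatafile (a standard Sqoop option, though I have not verified how it behaves with the Teradata driver or with binary columns) and dropping --fields-terminated-by, since that only applies to text output:

  # Same import as above, but writing Avro data files instead of delimited text (untested sketch)
  sqoop import \
    --connect jdbc:teradata://aa.bb.cc.internal/DATABASE=someDB \
    --username sqoop_usr --password sqoop_usr \
    --table ENCRYPTED_TBL \
    -m 1 \
    --target-dir /tmp/ENC_TBL \
    --connection-manager "org.apache.sqoop.manager.GenericJdbcManager" \
    --driver com.teradata.jdbc.TeraDriver \
    --as-avrodatafile

If that is a sensible direction, or if --as-sequencefile would be a better fit here, any pointers would be appreciated.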
