Another option is to create a job that casts the blob values to strings
using a free-form query: sqoop ... --query "SELECT id, CAST(name AS
CHAR(10000) CHARACTER SET utf8) FROM mi_blob_test WHERE \$CONDITIONS" ....
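
A full command along those lines might look like this (a sketch only: the
connection flags are copied from your original import command below, while
--split-by id and the CHAR(10000) length are assumptions to adjust):

sqoop import \
  --connect jdbc:mysql://test.db.gwynniebee.com:3306/tempdb \
  --username root --password-file /tmp/.password \
  --query 'SELECT id, CAST(name AS CHAR(10000) CHARACTER SET utf8) AS name FROM mi_blob_test WHERE $CONDITIONS' \
  --split-by id \
  --hive-import --hive-table tempdb.stg_mi_blob_test1 --hive-overwrite \
  --target-dir /data/tempdb/stg_mi_blob_test1 \
  --null-string '\\N' --null-non-string '\\N'

Note that --query needs --split-by (or -m 1), and with single quotes around
the query the $CONDITIONS token does not need to be escaped.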

-Abe

On Mon, Jul 6, 2015 at 9:43 PM, Manikandan R <[email protected]> wrote:

> Yes, that also gives the same result.
>
> On Tue, Jul 7, 2015 at 6:21 AM, Abraham Elmahrek <[email protected]> wrote:
>
>> Have you tried "--map-column-java"?
>> http://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html#_controlling_type_mapping
>> .
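>>
>> For what it's worth, that would look roughly like the following (a sketch;
>> the other flags are copied from the import command quoted below, with
>> --map-column-java name=String replacing --map-column-hive name=string):
>>
>> sqoop import --connect jdbc:mysql://test.db.gwynniebee.com:3306/tempdb \
>>   --username root --password-file /tmp/.password --table mi_blob_test \
>>   --hive-import --hive-table tempdb.stg_mi_blob_test1 --hive-overwrite \
>>   --target-dir /data/tempdb/stg_mi_blob_test1 --null-string '\\N' \
>>   --null-non-string '\\N' --map-column-java name=String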
>>
>> On Mon, Jul 6, 2015 at 5:16 AM, Manikandan R <[email protected]>
>> wrote:
>>
>>> I am using
>>>
>>> select unhex(regexp_replace(name, " ", "")) from
>>> tempdb.stg_mi_blob_test4;
>>>
>>> and it works.
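>>>
>>> If it helps, one way to avoid repeating that expression in every query
>>> would be to wrap it in a view (a sketch; the view name is made up):
>>>
>>> CREATE VIEW tempdb.stg_mi_blob_test4_str AS
>>> SELECT id, unhex(regexp_replace(name, " ", "")) AS name
>>> FROM tempdb.stg_mi_blob_test4;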
>>>
>>> On Mon, Jul 6, 2015 at 3:34 PM, Manikandan R <[email protected]>
>>> wrote:
>>>
>>>> I am using the --map-column-hive option to convert columns of type BLOB
>>>> to string as part of the import.
>>>>
>>>> For example,
>>>>
>>>> On the MySQL side:
>>>>
>>>> mysql> select * from mi_blob_test;
>>>> +------+-------+
>>>> | id   | name  |
>>>> +------+-------+
>>>> |    1 | Mani1 |
>>>> +------+-------+
>>>> 1 row in set (0.00 sec)
>>>>
>>>>
>>>> import --connect jdbc:mysql://test.db.gwynniebee.com:3306/tempdb
>>>> --username root --password-file /tmp/.password --table mi_blob_test
>>>> --hive-import --hive-table tempdb.stg_mi_blob_test1 --hive-overwrite
>>>> --target-dir /data/tempdb/stg_mi_blob_test1 --null-string '\\N'
>>>> --null-non-string '\\N' --map-column-hive name=string
>>>>
>>>> The command runs fine.
>>>>
>>>> But when I try to read the data using the Hive or Impala shell, I see
>>>> this:
>>>>
>>>> select * from stg_mi_blob_test4;
>>>> Query: select * from stg_mi_blob_test4
>>>> +----+----------------+
>>>> | id | name           |
>>>> +----+----------------+
>>>> | 1  | 4d 61 6e 69 31 |
>>>> +----+----------------+
>>>> Returned 1 row(s) in 0.75s
>>>>
>>>>
>>>> Thanks,
>>>> Mani
>>>>
>>>
>>>
>>
>
