I'm encountering an issue when exporting data from HDFS to Oracle, and I'm not
sure whether this is a known issue. When I checked the log carefully, I saw
that one of the columns has "2/13/2013 4:35:50", whereas the corresponding
column type in Oracle is VARCHAR2.
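
For context, the export invocation is essentially of this form (the connection
string, table name, directory, and delimiter below are placeholders, not my
real values):

    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username myuser -P \
      --table MY_TABLE \
      --export-dir /user/tanzir/data \
      --fields-terminated-by ','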

Can't we export date or time values simply as strings? When my HDFS data
contains values like that, Sqoop halts, and in the end the task is killed by
the MapReduce task timeout.
Based on this thread:
http://qnalist.com/questions/31561/sqoop-modifies-the-date-format-in-the-exported-data
I'm getting the impression that the Oracle driver converts the value to a
timestamp and tries to insert it, and when the target column type isn't a
timestamp, the export halts. Please correct me if I'm wrong.
Is there any way to avoid that kind of conversion and just export the values
as strings?
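
For example, would something like the following work? I'm assuming here that
--map-column-java (the codegen option) can be passed to sqoop export to force
the column to be treated as a plain Java String instead of a Timestamp; the
column name EVENT_TIME is just a placeholder for my actual column:

    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username myuser -P \
      --table MY_TABLE \
      --export-dir /user/tanzir/data \
      --map-column-java EVENT_TIME=String

If that isn't the right approach, I'd appreciate a pointer to whatever option
does control this mapping.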


Thanks in advance.

Sincerely,
Tanzir
