Hi Team,
I have a column in a Hive table whose datatype is string and whose values
can exceed 4000 characters.
I want to export this Hive table to an Oracle DB, and the possible column
datatypes I can use on the DB side are VARCHAR2(4000) or CLOB. Using
VARCHAR2(4000) throws an error because the values are longer than 4000
characters, so CLOB is the only option on the DB side.
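For reference, the target table on the Oracle side is defined roughly like
this (the table and column names are just placeholders for illustration):

    CREATE TABLE my_export_table (
      id        NUMBER,
      big_text  CLOB   -- VARCHAR2(4000) here rejects values over 4000 chars
    );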
When I use CLOB on the DB side, the Sqoop export fails with:
Caused by: java.io.IOException: Could not buffer record
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:218)
        at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:46)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:591)
        at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:106)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:84)
        ... 10 more
Caused by: java.lang.CloneNotSupportedException: com.cloudera.sqoop.lib.ClobRef
        at java.lang.Object.clone(Native Method)
        at org.apache.sqoop.lib.LobRef.clone(LobRef.java:109)
Am I missing something before doing the export, such as another property
I need to set on the Sqoop end?
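In case it helps, the export command I am running is along these lines
(the connection string, paths, and names below are placeholders, not the
real ones):

    sqoop export \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username myuser \
      --password-file /user/me/.pwfile \
      --table MY_EXPORT_TABLE \
      --export-dir /user/hive/warehouse/my_export_table \
      --input-fields-terminated-by '\001' \
      -m 4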