Jarcec - I've pasted the task tracker logs for one of the failed map tasks 
below. I took Venkat's suggestion and imported the data into Hive using the 
following command:

sqoop import --hive-import -m8 --hive-table schema.table \
  --target-dir schema.table --table schema.table \
  --connect jdbc:oracle:thin:@xxx.xxx.xxx:0000/schema \
  --username someuser --password somepw --split-by id_column

Then I tried to export from Hive back to Oracle using the command below, which 
produced the error:

sqoop export -m8 --table schema.table --export-dir /user/username/file \
  --connect jdbc:oracle:thin:@xxx.xxx.xxx.com:0000/schema \
  --username someuser --password somepw

I checked the rows Sqoop said had errors but didn't see any anomalies.
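
For reference, here's the variant I'm planning to try next. Since 
--hive-import (without explicit delimiter options) writes Hive's default 
non-printing ^A (\001) field delimiter, which would also explain why the "On 
input" row in the logs below shows the fields run together, I suspect the 
export needs to be told about that delimiter explicitly. This is only a 
sketch: the warehouse path is a guess on my part, and if the table stores 
nulls as \N rather than the literal string "null", the --input-null-string / 
--input-null-non-string options would be needed as well:

sqoop export -m8 --table schema.table \
  --export-dir /user/hive/warehouse/schema.table \
  --input-fields-terminated-by '\001' \
  --connect jdbc:oracle:thin:@xxx.xxx.xxx.com:0000/schema \
  --username someuser --password somepw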

Task Tracker logs (I had to strip some of the values and replace them with 
generic ones, but I left the data types/lengths the same):

syslog logs
2013-10-31 11:45:01,054 INFO org.apache.hadoop.util.NativeCodeLoader: Loaded 
the native-hadoop library
2013-10-31 11:45:01,532 INFO org.apache.hadoop.util.ProcessTree: setsid exited 
with exit code 0
2013-10-31 11:45:01,535 INFO org.apache.hadoop.mapred.Task:  Using 
ResourceCalculatorPlugin : 
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@6a0da90c
2013-10-31 11:45:01,671 INFO org.apache.hadoop.mapred.MapTask: Processing 
split: 
Paths:/user/UserName/table_name/part-m-00000:0+67108864,/user/UserName/table_name/part-m-00000:67108864+67108864
2013-10-31 11:45:15,162 INFO com.hadoop.compression.lzo.GPLNativeCodeLoader: 
Loaded native gpl library
2013-10-31 11:45:15,164 INFO com.hadoop.compression.lzo.LzoCodec: Successfully 
loaded & initialized native-lzo library [hadoop-lzo rev 
cf4e7cbf8ed0f0622504d008101c2729dc0c9ff3]
2013-10-31 11:45:15,167 WARN org.apache.hadoop.io.compress.snappy.LoadSnappy: 
Snappy native library is available
2013-10-31 11:45:15,167 INFO org.apache.hadoop.io.compress.snappy.LoadSnappy: 
Snappy native library loaded
2013-10-31 11:45:15,197 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
2013-10-31 11:45:15,197 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
Exception raised during data export
2013-10-31 11:45:15,197 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
2013-10-31 11:45:15,197 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
Exception: 
java.lang.NumberFormatException
        at java.math.BigDecimal.<init>(BigDecimal.java:459)
        at java.math.BigDecimal.<init>(BigDecimal.java:728)
        at table_name.__loadFromFields(table_name.java:824)
        at table_name.parse(table_name.java:776)
        at 
org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        at 
org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at 
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:363)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: On 
input: 1234567XYZE1PC7654321SOME TEXT FIELDnull312 MAIN STANY 
CITYNY67560nullnullYnullnullnullnullnullnullnull7654321Some 
Description2012-05-30 00:00:00.0N
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: On 
input file: null
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: At 
position 0
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
Currently processing split:
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
Paths:/user/UserName/table_name/part-m-00000:0+67108864,/user/UserName/table_name/part-m-00000:67108864+67108864
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: This 
issue might not necessarily be caused by current input
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: due 
to the batching nature of export.
2013-10-31 11:45:15,199 ERROR org.apache.sqoop.mapreduce.TextExportMapper: 
2013-10-31 11:45:15,199 INFO org.apache.sqoop.mapreduce.AutoProgressMapper: 
Auto-progress thread is finished. keepGoing=false
2013-10-31 11:45:15,205 INFO org.apache.hadoop.mapred.TaskLogsTruncater: 
Initializing logs' truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-10-31 11:45:15,280 INFO org.apache.hadoop.io.nativeio.NativeIO: 
Initialized cache for UID to User mapping with a cache timeout of 14400 seconds.
2013-10-31 11:45:15,280 INFO org.apache.hadoop.io.nativeio.NativeIO: Got 
UserName mapred for UID 498 from the native implementation
2013-10-31 11:45:15,281 ERROR org.apache.hadoop.security.UserGroupInformation: 
PriviledgedActionException as:UserName cause:java.io.IOException: Can't export 
data, please check task tracker logs
2013-10-31 11:45:15,281 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: Can't export data, please check task tracker logs
        at 
org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at 
org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at 
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:363)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at 
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.NumberFormatException
        at java.math.BigDecimal.<init>(BigDecimal.java:459)
        at java.math.BigDecimal.<init>(BigDecimal.java:728)
        at table_name.__loadFromFields(table_name.java:824)
        at table_name.parse(table_name.java:776)
        at 
org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        ... 10 more
2013-10-31 11:45:15,283 INFO org.apache.hadoop.mapred.Task: Runnning cleanup 
for the task

Thanks for the help!

-----Original Message-----
From: Jarek Jarcec Cecho [mailto:[email protected]] 
Sent: Thursday, October 31, 2013 2:08 AM
To: [email protected]
Subject: Re: Sqoop from/to Oracle

> java.io.IOException: Can't export data, please check task tracker
> logs

Would you mind also sharing the failed map task log? It will contain additional 
information about the failure, such as the exact row that Sqoop can't parse.

Jarcec

On Wed, Oct 30, 2013 at 01:44:32PM -0700, Venkat Ranganathan wrote:
> Does the data in any of the varchar columns contain the delimiter
> character? That could be tripping the parser. You can replace or drop
> the delimiter using the --hive-delims-replacement or
> --hive-drop-import-delims options (yes, they can be used independently
> of a Hive import).
> 
> Alternatively, you can import into a Hive table that does not use a
> text format, in which case you don't have to modify the data at all.
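> 
> For example, something along these lines (a sketch only; the table and
> connection values are placeholders):
> 
> sqoop import --hive-import --hive-table SCHEMA.TABLE \
>   --table SCHEMA.TABLE --target-dir /path/to_dir \
>   --hive-drop-import-delims \
>   --connect jdbc:oracle:thin:@host:port/schema \
>   --username xxx --password xxx --split-by ID_COLUMN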
> 
> Venkat
> 
> 
> On Wed, Oct 30, 2013 at 12:38 PM, Martin, Nick <[email protected]> wrote:
> 
> > Ok, here's an update on this. I'm going to provide as much intel as
> > possible, but let me know if folks need more context. Appreciate the
> > help in advance...
> >
> > 1. Process I'm testing: source table in Oracle 11.2.0.3 > Sqoop
> > import to HDFS > Sqoop export from HDFS > target table in Oracle
> > 11.2.0.3 (source and target have the exact same DDL; Sqoop version
> > is 1.4.3.1.3.2.0-111).
> >
> > 2. Source/target table schema (column / type):
> >
> > COL1    NUMBER
> > COL2    VARCHAR2(3 BYTE)
> > COL3    VARCHAR2(2 BYTE)
> > COL4    VARCHAR2(2 BYTE)
> > COL5    NUMBER
> > COL6    VARCHAR2(60 BYTE)
> > COL7    VARCHAR2(70 BYTE)
> > COL8    VARCHAR2(70 BYTE)
> > COL9    VARCHAR2(40 BYTE)
> > COL10   VARCHAR2(3 BYTE)
> > COL11   VARCHAR2(12 BYTE)
> > COL12   VARCHAR2(30 BYTE)
> > COL13   DATE
> > COL14   VARCHAR2(1 BYTE)
> > COL15   VARCHAR2(70 BYTE)
> > COL16   VARCHAR2(70 BYTE)
> > COL17   DATE
> > COL18   VARCHAR2(30 BYTE)
> > COL19   DATE
> > COL20   VARCHAR2(30 BYTE)
> > COL21   VARCHAR2(3 BYTE)
> > COL22   NUMBER
> > COL23   VARCHAR2(30 BYTE)
> > COL24   DATE
> > COL25   VARCHAR2(1 BYTE)
> >
> > 3. Sqoop import syntax:
> >
> > sqoop import --table schema.table_name -m 8 --target-dir path/to_dir \
> >   --connect jdbc:oracle:thin:@xxx.xxx.com:0000/schema \
> >   --username xxx --password xxx
> >
> > - Import doesn't fail; the record count imported into HDFS matches
> >   the row count on the source table.
> >
> > 4. Sqoop export syntax:
> >
> > sqoop export --connect jdbc:oracle:thin:@xxx.xxx.xxx.com:0000/schema \
> >   --table schema.table_name --export-dir /path/to_dir \
> >   --username xxx --password xxx
> >
> > - Export fails with:
> >
> > java.io.IOException: Can't export data, please check task tracker logs
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
> >         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
> >         at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
> >         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:363)
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
> >         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > Caused by: java.lang.IllegalArgumentException: Timestamp format must
> > be yyyy-mm-dd hh:mm:ss[.fffffffff]
> >         at java.sql.Timestamp.valueOf(Timestamp.java:194)
> >         at tableinfo.__loadFromFields(tableinfo.java:939)
> >         at tableinfo.parse(schema_tableinfo.java:776)
> >         at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
> >         ... 10 more
> >
> > Venkat - I tried your suggestion below and the same failure happened
> > (although I checked in HDFS, and it did indeed bring the DATE in
> > without flipping it to TIMESTAMP).
> >
> > Any ideas? Need any other info?
> >
> > From: Venkat Ranganathan [mailto:[email protected]]
> > Sent: Wednesday, October 30, 2013 12:55 AM
> > To: [email protected]
> > Subject: Re: Sqoop from/to Oracle
> >
> > One thing you may want to do (depending on the version of Oracle you
> > are using) is to set up a connection parameter file (see the
> > --connection-param-file option) and set the Oracle JDBC connection
> > parameter that stops dates being converted to timestamps:
> >
> > oracle.jdbc.mapDateToTimestamp=false
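> >
> > For example (a sketch; the file name is a placeholder, and the
> > connection values are your own):
> >
> > $ cat oracle.properties
> > oracle.jdbc.mapDateToTimestamp=false
> >
> > sqoop import --connection-param-file oracle.properties \
> >   --table SCHEMA.TABLE --target-dir /path/to_dir \
> >   --connect jdbc:oracle:thin:@host:port/schema \
> >   --username xxx --password xxx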
> >
> > On Tue, Oct 29, 2013 at 5:32 PM, Martin, Nick <[email protected]> wrote:
> >
> > Weird... let me re-test, and if it fails again I'll include some
> > sample data and the error output.
> >
> > Thanks for the help, Abe!
> >
> > From: Abraham Elmahrek [mailto:[email protected]]
> > Sent: Tuesday, October 29, 2013 7:06 PM
> > To: [email protected]
> > Subject: Re: Sqoop from/to Oracle
> >
> > Nick,
> >
> > I haven't tested this, but Sqoop should accept full timestamps when
> > exporting into DATE columns. If the data you're exporting has a full
> > timestamp (as it should after import), then the export job should
> > just work.
> >
> > -Abe
> >
> > On Mon, Oct 28, 2013 at 9:55 AM, Martin, Nick <[email protected]> wrote:
> >
> > Hi Abe,
> >
> > After doing some digging, I've found I'm facing what seems like a
> > fairly common issue: importing data from an Oracle DATE column and
> > then trying to export that same data (from HDFS) back into an Oracle
> > DATE column (of course, it will have been converted to a timestamp
> > on the way into HDFS).
> >
> > So, what's the current thinking on the best way to get around this?
> >
> > Thanks,
> > Nick
> >
> > From: Abraham Elmahrek [mailto:[email protected]]
> > Sent: Tuesday, October 01, 2013 7:31 PM
> > To: [email protected]
> > Subject: Re: Sqoop from/to Oracle
> >
> > Nick,
> >
> > It looks like Sqoop believes the format of the timestamps is
> > incorrect. Could you please inspect the data you are attempting to
> > import and verify that the columns are aligned and the data is
> > correctly formatted? I believe if you use the --verbose option, Sqoop
> > will give more details in its own logs and in the task logs. Can you
> > post the task logs back here?
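> >
> > For example (a sketch; substitute your real connection string, table,
> > and export directory):
> >
> > sqoop export --verbose --connect jdbc:oracle:thin:@host:port/schema \
> >   --table SCHEMA.TABLE --export-dir /path/to_dir \
> >   --username xxx --password xxx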
> >
> > Some other information that might be helpful to us is an example of
> > the data you're trying to export and the schema of the table you're
> > importing to.
> >
> > -Abe
> >
> > On Tue, Oct 1, 2013 at 4:11 PM, Martin, Nick <[email protected]> wrote:
> >
> > Hi Abe,
> >
> > Just checking in to see if you had any suggestions for me to try?
> >
> > Thanks again,
> > Nick
> >
> > Sent from my iPhone
> >
> >
> > On Sep 30, 2013, at 6:36 PM, "Abraham Elmahrek" <[email protected]> wrote:
> >
> > Nick,
> >
> > What is the exact command you are using and the exact error you are
> > seeing? Also, what version of Sqoop are you using?
> >
> > -Abe
> >
> > On Mon, Sep 30, 2013 at 3:32 PM, Martin, Nick <[email protected]> wrote:
> >
> > Hi all,
> >
> > I have a table I've imported from Oracle into HDFS, and now I want to
> > export it into an Oracle DB (the import and export table schemas are
> > identical).
> >
> > My initial attempt at exporting from HDFS (using Sqoop) to Oracle
> > failed (data type failures). So, my question is: what's the best way
> > for me to accomplish this? My source table is a mixture of NUMBER,
> > VARCHAR, and DATE fields. Am I better off importing into a Hive table
> > I define and then exporting into Oracle from there? Or do I just need
> > to do some export formatting in my Sqoop export statement?
> >
> > Any other intel you'd need to make a recommendation?
> >
> > Thanks in advance,
> > Nick
> >
> > Sent from my iPhone