[ https://issues.apache.org/jira/browse/SQOOP-2800?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15112063#comment-15112063 ]

prem commented on SQOOP-2800:
-----------------------------

16/01/22 12:36:04 INFO mapreduce.Job: Task Id : attempt_1452013806070_8775_m_000002_0, Status : FAILED
Error: com.teradata.connector.common.exception.ConnectorException: index outof boundary
        at com.teradata.connector.teradata.converter.TeradataConverter.convert(TeradataConverter.java:145)
        at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:106)
        at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.write(ConnectorOutputFormat.java:65)
        at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
        at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
        at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
        at com.teradata.connector.common.ConnectorMMapper.map(ConnectorMMapper.java:129)
        at com.teradata.connector.common.ConnectorMMapper.run(ConnectorMMapper.java:117)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)


When I remove the option -- --default-character-set=UNICODE, it works fine.

Please suggest a solution as soon as possible.
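For reference, the workaround described above (dropping the connector's character-set option) amounts to running the same export without the trailing "--" pass-through arguments. This is only a sketch assembled from the masked command in the issue: the JDBC connect string, host, and credentials are placeholders, and the connector options after "--" have simply been removed.

```shell
# Sketch of the working export per the comment above (hypothetical
# placeholders for the masked host/database/credentials; '/HDFS PATH'
# is the placeholder path from the original report).
sqoop export \
  --connect "jdbc:teradata://<host>/Database=<db>" \
  --username '**' --password '**' \
  --input-null-string '\\N' --input-null-non-string '\\N' \
  --input-fields-terminated-by '\001' \
  --num-mappers 1 \
  --export-dir '/HDFS PATH' \
  --table S_CUST
```

Note that in the original command --num-mappers and --input-fields-terminated-by appear after the "--" separator, which Sqoop treats as connector-specific arguments; in this sketch they are kept before it as regular Sqoop options.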

> Not able to transfer unicode data to Teradata
> ---------------------------------------------
>
>                 Key: SQOOP-2800
>                 URL: https://issues.apache.org/jira/browse/SQOOP-2800
>             Project: Sqoop
>          Issue Type: Bug
>          Components: sqoop2-api
>    Affects Versions: 1.4.3
>         Environment: UAT
>            Reporter: prem
>             Fix For: 1.99.7, 1.4.7
>
>   Original Estimate: 96h
>  Remaining Estimate: 96h
>
> sqoop export --connect "$!/Database=**" --username ** --password ** --input-null-string '\\N' --input-null-non-string '\\N' --export-dir /HDFS PATH --table S_CUST -- --default-character-set=UTF8 --num-mappers 1 --input-fields-terminated-by '\001'
> For one column it works fine, but when the HDFS file has multiple columns it
> is not able to transfer the file:



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)