I am using Sqoop 1.4.4 and importing from MySQL. Why do I get the error 
"unrecognized parameter: merge-key"?
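
In case it matters: my Sqoop version's import tool may simply not support 
--merge-key at all, while the standalone "sqoop merge" tool does accept it. 
A sketch with hypothetical paths and the jar/class produced by a previous 
import's codegen (untested):

sqoop merge --merge-key id \
  --new-data /user/$USER/mytable_new \
  --onto /user/$USER/mytable \
  --target-dir /user/$USER/mytable_merged \
  --jar-file ./mytable.jar \
  --class-name mytable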


At 2015-09-04 17:13:52, [email protected] wrote:


Thank you Abe,

 

Yes, my data has a terminal character, i.e., the data itself contains the same 
character that I have used as the Sqoop delimiter.
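
If that is the cause, I suppose I can enclose and escape fields on both the 
write side and the parse side; a sketch (untested) of the options I would add 
to my job:

sqoop import ... \
  --fields-terminated-by ',' \
  --optionally-enclosed-by '"' \
  --escaped-by '\\' \
  --input-fields-terminated-by ',' \
  --input-optionally-enclosed-by '"' \
  --input-escaped-by '\\'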

 

 

Regards

Suman

 

From: Abraham Elmahrek [mailto:[email protected]]
Sent: 04 September 2015 00:47
To:[email protected]
Subject: Re: Sqoop Merge failed

 

Hello,

 

It seems like there's a parsing error of some kind. Maybe there's a terminal 
character in your data somewhere. Could you re-run your job with --verbose and 
look at the task logs? There should be logs of the data being transferred. We 
can inspect what's going wrong in the records themselves.
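
To illustrate with a made-up row (not from your actual data): suppose the 
columns were (USER_NAME, SOURCE, LAST_UPDATED_DTIME) and a name contained an 
unescaped comma:

  intended: SMITH JOHN,BTCOM_MIGR_C,2015-08-21 14:42:42
  shifted:  SMITH,JOHN,BTCOM_MIGR_C,2015-08-21 14:42:42

The second line splits into four fields instead of three, so 'BTCOM_MIGR_C' 
lands in the position where the generated record class calls 
Timestamp.valueOf(), which is exactly the IllegalArgumentException in your 
trace.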

 

-Abe

 

On Sun, Aug 30, 2015 at 11:23 PM <[email protected]> wrote:

Hi Abe,

 

PFB the CREATE statement of that table in Oracle:

 

create table XXX_XXX_XXXXXXXX
(
  XXX_XXXX_IDENTITY_ID      NUMBER not null,
  .....
  PRF_DOMAIN_ID             NUMBER not null
)

 

 

Thanks and Regards

Suman Chaitanya Dasari

 

From: Abraham Elmahrek [mailto:[email protected]]
Sent: 29 August 2015 03:40
To:[email protected]
Subject: Re: Sqoop Merge failed

 

Hey there,

 

Could you do a "DESCRIBE PRF_USER_IDENTITY" and paste the output here? Sqoop 
seems to think you're importing a timestamp.
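
If SQL*Plus isn't handy, sqoop eval can run an equivalent data dictionary 
query over the same JDBC connection (a sketch reusing the connect string from 
your job, untested):

sqoop eval --connect jdbc:oracle:thin:@//xxx.xx.xx.xx:xxxx/xxxxxx \
  --username profileuserr6_0 \
  --password-file /user/$USER/XXX/sqoop.password \
  --query "SELECT column_name, data_type, nullable FROM user_tab_columns WHERE table_name = 'PRF_USER_IDENTITY'"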

 

Also, --verbose should help you identify problems in general.

 

-Abe

 

On Thu, Aug 27, 2015 at 11:10 PM <[email protected]> wrote:

Hello Everyone,

 

I am trying to push incremental updates from Oracle to HDFS using the sqoop 
import command with the --merge-key option and incremental mode "lastmodified".

I have pasted my sqoop command below, along with part of an exception that 
repeats throughout my logs.

 

--target-dir /user/$USER/PRF_USER_IDENTITY is the same location that already 
holds the data to be merged with the latest incremental data.

The value that fails to parse, 'BTCOM_MIGR_C', lives in a column defined as 
VARCHAR2(20) NOT NULL.

 

I framed the command below based on the following link:

https://mail-archives.apache.org/mod_mbox/sqoop-user/201505.mbox/%3CCAMZRiGzDe9sN8+Tb_MWtLcmb8+Q=agturna4xkq7skykzu+...@mail.gmail.com%3E

 

 

 

sqoop job --create prf_user_identity_delta -- import \
  --connect jdbc:oracle:thin:@//xxx.xx.xx.xx:xxxx/xxxxxx \
  --username profileuserr6_0 \
  --password-file /user/$USER/XXX/sqoop.password \
  --table PRF_USER_IDENTITY \
  --incremental lastmodified \
  --check-column LAST_UPDATED_DTIME \
  --last-value "2015-08-21 14:42:42" \
  --merge-key PRF_USER_IDENTITY_ID \
  --null-string '\\N' --null-non-string '\\N' \
  --fields-terminated-by , \
  --input-null-string '\\N' --input-null-non-string '\\N' \
  --input-fields-terminated-by , \
  --target-dir /user/$USER/PRF_USER_IDENTITY -m 1

 

 

15/08/27 12:46:40 INFO mapreduce.Job: Task Id : attempt_1436876041180_239087_m_000078_2, Status : FAILED
Error: java.lang.RuntimeException: Can't parse input data: 'BTCOM_MIGR_C'
        at PRF_USER_IDENTITY.__loadFromFields(PRF_USER_IDENTITY.java:1164)
        at PRF_USER_IDENTITY.parse(PRF_USER_IDENTITY.java:1002)
        at org.apache.sqoop.mapreduce.MergeTextMapper.map(MergeTextMapper.java:53)
        at org.apache.sqoop.mapreduce.MergeTextMapper.map(MergeTextMapper.java:34)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:784)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.IllegalArgumentException: Timestamp format must be yyyy-mm-dd hh:mm:ss[.fffffffff]
        at java.sql.Timestamp.valueOf(Timestamp.java:202)
        at PRF_USER_IDENTITY.__loadFromFields(PRF_USER_IDENTITY.java:1081)
        ... 11 more

 

 

Please help me; I have been struggling with this error for the past couple of 
days.

 

 

Thanks

Suman

 
