Hi Abraham,
     Thanks! I have done the same: I exported the data from HBase to HDFS, and the exported file is placed at /user/hduser/esr_data, but I am still getting this exception. Please let me know what is wrong below.

One thing I could observe is that the file exported from HBase seems to contain serialized objects instead of TSV data, but I don't know how to get .tsv format out of the HBase export.
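As far as I can tell, the org.apache.hadoop.hbase.mapreduce.Export tool only writes SequenceFiles (ImmutableBytesWritable keys with serialized Result values), so there seems to be no built-in way to get .tsv out of it. I could confirm the file type with something like this (the part file name is just an example, and the HBase jars need to be on the Hadoop classpath, e.g. via HADOOP_CLASSPATH, for hadoop fs -text to deserialize the records):

    hadoop fs -text /user/hduser/esr_data/part-m-00000 | head -n 1

One rough workaround I am considering, just as a sketch (ESR_TABLE is a placeholder, and the shell output would still need cleanup before Sqoop could read it), is to dump the rows from the hbase shell:

    echo "scan 'ESR_TABLE'" | hbase shell > /tmp/esr_scan.txt

A small MapReduce job, or a Hive external table over the HBase table that writes tab-separated text to HDFS, would probably be a cleaner option.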

Thanks and regards,
Vaibhav Nirkhe
________________________________
From: Abraham Elmahrek [[email protected]]
Sent: Wednesday, October 09, 2013 10:34 PM
To: [email protected]
Subject: Re: Issue in Sqoop Export from HDFS(Hbase data) to MySql

User,

Exporting directly from HBase is not currently supported in Sqoop.

What you can do is export the HBase data into HDFS as delimited text first, then use Sqoop to transfer it into MySQL.
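Once the data is in HDFS as plain delimited text (one line per row, fields in the same order as --columns), the export would look roughly like the sketch below; the directory name and the tab delimiter are just examples:

    sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P \
      --table CNT_REPORT_DATA --columns CUSTOMER_ID,MONTH \
      --export-dir /user/hduser/esr_data_tsv \
      --input-fields-terminated-by '\t' -m 1

Without --input-fields-terminated-by, Sqoop assumes the fields are comma-separated.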

-Abe


On Wed, Oct 9, 2013 at 5:49 AM, Vaibhav V Nirkhe 
<[email protected]> wrote:
Hi,
       I am using Sqoop 1.4.3 on Hadoop 1.2.1 and trying to export HBase data placed in HDFS to MySQL; however, I am getting the following ClassCastException:

I am using the following command:

sqoop export --connect jdbc:mysql://localhost:3306/OMS --username root -P \
  --table CNT_REPORT_DATA --columns CUSTOMER_ID,MONTH \
  --export-dir /user/hduser/esr_data --verbose -m 1
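The data in /user/hduser/esr_data was produced with the HBase Export tool, roughly like this (the table name here is just a placeholder):

    hbase org.apache.hadoop.hbase.mapreduce.Export ESR_TABLE /user/hduser/esr_data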

I guess Sqoop is trying to fetch the record by its key but is not able to cast the key:

java.lang.ClassCastException: org.apache.hadoop.hbase.io.ImmutableBytesWritable cannot be cast to org.apache.hadoop.io.LongWritable
    at org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:95)
    at org.apache.sqoop.mapreduce.CombineShimRecordReader.getCurrentKey(CombineShimRecordReader.java:38)
    at org.apache.sqoop.mapreduce.CombineFileRecordReader.getCurrentKey(CombineFileRecordReader.java:79)
    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.getCurrentKey(MapTask.java:503)
    at org.apache.hadoop.mapreduce.MapContext.getCurrentKey(MapContext.java:57)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
    at org.apache.hadoop.mapred.Child.main(Child.java:249)


I don't understand why the key is always expected to be a LongWritable here. Please suggest as soon as possible.



Thanks in advance,




