Yes, I can get the logs, but first I'll have to mock it up in my lab with some
dummy data and credentials. I should be able to provide full logs tomorrow.

My darn signature leaked out on my last reply. If anybody can scrub my last 
post and remove my signature that would be awesome.

Thanks,
-Eric


On Jul 24, 2013, at 3:51 PM, Abraham Elmahrek <[email protected]> wrote:

Eric,

The middle command seems right. Could you provide the rest of your logs? It
will help us understand where in the process Sqoop fails.

-Abe



I have tried many different variations, all with the same result:

sqoop export --connect 'jdbc:mysql://mysqlIP:3306/hadoop' --username=hadoop 
--password='sanitized' --table=tableA --export-dir /hive/tableA -m 1 
--fields-terminated-by '\001'

sqoop export --connect 'jdbc:mysql://mysqlIP:3306/hadoop' --username=hadoop 
--password='sanitized' --table=tableA --export-dir /hive/tableA -m 1 
--input-fields-terminated-by '\001'

sqoop export --connect 'jdbc:mysql://mysqlIP:3306/hadoop' --username=hadoop 
--password='sanitized' --table=tableA --export-dir /hive/tableA -m 1
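
For reference, a common cause of the BytesWritable/LongWritable cast error is
that the export job is reading the export directory as text when the files are
actually SequenceFiles. In that case Sqoop generally needs the record class that
was generated during the original import. A hedged sketch of that variant
follows; the jar path and class name are assumptions, not taken from this
thread, and would need to match whatever the original import produced:

```shell
# Sketch only: reuse the record class generated at import time so the
# export job can deserialize the SequenceFile records.
# /path/to/tableA.jar and the class name "tableA" are placeholders.
sqoop export \
  --connect 'jdbc:mysql://mysqlIP:3306/hadoop' \
  --username hadoop \
  --password 'sanitized' \
  --table tableA \
  --export-dir /hive/tableA \
  --class-name tableA \
  --jar-file /path/to/tableA.jar \
  -m 1
```

If the generated jar from the import is no longer around, `sqoop codegen`
against the same table can regenerate it.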





Hey Eric,

I believe it's possible. Can you provide the command you are using?

-Abe


On Wed, Jul 24, 2013 at 2:54 PM, Eric Hernandez wrote:
Hi,
Is it possible to Sqoop data out of Hive back into an RDBMS like MySQL or SQL 
Server when it has been imported via Sqoop as a sequence file?

I have been trying all day to get data back out of Hive, and no matter what I 
try I keep getting this error:

"java.lang.ClassCastException: org.apache.hadoop.io.BytesWritable cannot be 
cast to org.apache.hadoop.io.LongWritable"

I am using Sqoop 1.4.1-cdh4.1.2

Thanks,

Eric H


