exporting partitioned data into remote database

2010-06-18 Thread Szymon Gwóźdź
Hi! I have table tb1 defined by: CREATE TABLE tb1(user int, counter int) PARTITIONED BY (day string) STORED AS TEXTFILE I want to export data from this table into mysql table defined by: CREATE TABLE tb2(user int, counter int, day string) I've tried to use Sqoop in order to do this but Sqoop
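Since the partition column `day` exists only in the directory layout (not in the row data), one common workaround is to first materialize it into a flat copy with a Hive query, then export that copy with a plain Sqoop run. A minimal sketch, with an assumed staging path; the command is printed rather than executed, since it needs a running Hive cluster:

```shell
# Hypothetical staging directory; any writable HDFS path works.
STAGE_DIR=/tmp/tb1_flat

# Build the Hive command that writes (user, counter, day) rows,
# including the partition column, into the staging directory.
stage_cmd() {
    printf "hive -e \"INSERT OVERWRITE DIRECTORY '%s' SELECT user, counter, day FROM tb1\"\n" "$STAGE_DIR"
}

stage_cmd
```

After that, the staged directory contains the `day` value in each row, so it matches the three-column MySQL table `tb2` directly.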

Re: alter table add partition error

2010-06-18 Thread Edward Capriolo
On Fri, Jun 18, 2010 at 1:49 PM, Ning Zhang nzh...@facebook.com wrote: Pradeep, I ran the commands you provided and it succeeded with the expected behavior. One possibility is that there are multiple versions of libthrift.jar in your CLASSPATH (Hadoop and Hive). Can you check in the Hadoop
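A quick way to check for the duplicate-jar problem Ning describes is to list every copy of libthrift under the lib directories that end up on the CLASSPATH. A small sketch; the paths in the example call are assumptions, so substitute your actual Hadoop and Hive install locations:

```shell
# Print every libthrift*.jar found under the given directories, so
# conflicting versions on the CLASSPATH become visible at a glance.
find_thrift_jars() {
    for dir in "$@"; do
        # Skip directories that do not exist on this machine.
        [ -d "$dir" ] && find "$dir" -name 'libthrift*.jar'
    done
    return 0
}

find_thrift_jars /usr/lib/hadoop/lib /usr/lib/hive/lib "$HOME/hive/lib"
```

If more than one version shows up, removing or shading all but one is the usual fix.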

hive.aux.jars.path not used - get a SerDe does not exist error

2010-06-18 Thread Karthik
I have my custom SerDe classes (as jar files) under /home/hadoop/hive/lib folder and I have set the hive.aux.jars.path property in my hive-site.xml file to this location (value: file:///home/hadoop/hive/lib/). When I create a table (or query an existing table), I get a "SerDe does not exist" error,
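One likely cause, depending on the Hive version: hive.aux.jars.path generally expects a comma-separated list of jar files rather than a bare directory URI. A sketch of building that list from a directory and passing it via the `--auxpath` CLI flag instead; the directory comes from the post above, and the final command is printed rather than run since it needs a Hive install:

```shell
# Join every .jar in a directory into a comma-separated list,
# the format hive.aux.jars.path / --auxpath typically expects.
join_jars() {
    ls "$1"/*.jar 2>/dev/null | paste -s -d, -
}

AUX_JARS=$(join_jars /home/hadoop/hive/lib)

# The resulting invocation (shown, not executed):
echo hive --auxpath "$AUX_JARS"
```

Equivalently, the same comma-separated list can be placed in the hive.aux.jars.path value in hive-site.xml, or exported as HIVE_AUX_JARS_PATH before starting Hive.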

Re: exporting partitioned data into remote database

2010-06-18 Thread Aaron Kimball
Hi Szymon, Unfortunately Sqoop can't yet read partition column information out of the Hive table. So you'll need to export each partition individually. You can probably get your system to work right by doing these two commands: sqoop --connect jdbc:mysql://test-db.gadu/crunchers --username
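The per-partition approach Aaron describes can be sketched roughly as below. The connect string, username, warehouse path, and day list are all placeholders (the real values from his reply are cut off above), and exact Sqoop flags vary by version; the commands are printed rather than executed, since they need a live cluster and database:

```shell
# Assumed default Hive warehouse location; each partition of tb1
# lives in its own day=<value> subdirectory there.
WAREHOUSE=/user/hive/warehouse

# Build one sqoop export command for a single partition. The day
# column is not stored in the data files, so each run only covers
# the (user, counter) columns of that one partition's directory.
export_cmd_for_day() {
    day=$1
    printf 'sqoop export --connect jdbc:mysql://dbhost/dbname --username dbuser --table tb2 --export-dir %s/tb1/day=%s\n' "$WAREHOUSE" "$day"
}

# In practice the list would come from: hive -e "SHOW PARTITIONS tb1"
for day in 2010-06-16 2010-06-17 2010-06-18; do
    export_cmd_for_day "$day"
done
```

Note that something still has to supply the `day` value on the MySQL side per run (for example a staging table or a column default), since the exported files themselves do not contain it.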