Yeah, we only support exporting Parquet files when they were created with the Kite data set :(
On Tue, Mar 17, 2015 at 10:08 AM, Suresh Kumar Sethuramaswamy <[email protected]> wrote:
> Gwen,
>
> It is a Parquet format file.
>
> Regards
> Suresh
>
>
> On Tue, Mar 17, 2015 at 11:00 AM, Gwen Shapira <[email protected]> wrote:
>>
>> It looks like for some reason Sqoop is trying to export your partition
>> as if it was a Kite data set.
>>
>> What's the file format of the table? (i.e. Avro? Parquet? Text?)
>>
>> On Tue, Mar 17, 2015 at 7:42 AM, Suresh Kumar Sethuramaswamy
>> <[email protected]> wrote:
>> >
>> > Hi,
>> >
>> > I have a partitioned Hive table which I want to export to an Oracle
>> > table.
>> >
>> > Sqoop statement
>> > ----------------
>> > sqoop export -D
>> > mapred.child.java.opts="-Djava.security.egd=file:/dev/../dev/urandom"
>> > --connect jdbc:oracle:thin:@//<host:Port>/<DBNAME> --username <user>
>> > --password <password> --table <oracletable> --export-dir
>> > /user/hive/warehouse/<db>/<hivetable>/<partition1>/<partition2>
>> > --enclosed-by '\"'
>> >
>> > Env:
>> > -----
>> > CDH 5.3.0
>> > Sqoop 1.4.5
>> > Hive 0.13
>> >
>> > Error:
>> > ----
>> > org.kitesdk.data.DatasetNotFoundException: Descriptor location does not
>> > exist:
>> > hdfs://<namenode>:8020/user/hive/warehouse/<db>/<hivetable>/<partition1>/<partition2>/.metadata
>> >
>> > Please help resolve this or suggest a better option to export a
>> > partitioned Hive table's data to an Oracle table.
>> >
>> > Regards
>> > Suresh
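One workaround worth trying (a sketch, not confirmed on CDH 5.3 — table and column names below are placeholders): since Sqoop can export plain text directories without going through Kite, stage the partition into a text-format Hive table first, then point sqoop export at the staging table's directory using the same flags as in the original command.

```shell
# 1) In Hive: copy the partition into a text-format staging table.
#    <db>, <hivetable>, <partition1>/<partition2> as in the thread;
#    export_staging is a hypothetical name.
hive -e "
CREATE TABLE <db>.export_staging
  ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
  STORED AS TEXTFILE
AS SELECT * FROM <db>.<hivetable>
  WHERE part1 = '<partition1>' AND part2 = '<partition2>';
"

# 2) Export the text files with Sqoop (same connect/credential flags
#    as the original command, plus the matching field delimiter).
sqoop export \
  --connect jdbc:oracle:thin:@//<host:Port>/<DBNAME> \
  --username <user> --password <password> \
  --table <oracletable> \
  --export-dir /user/hive/warehouse/<db>/export_staging \
  --input-fields-terminated-by ',' \
  --enclosed-by '\"'

# 3) Drop the staging table once the export succeeds.
hive -e "DROP TABLE <db>.export_staging;"
```

This avoids the Kite .metadata lookup entirely because the export directory now contains delimited text rather than Parquet. The delimiter choice is up to you; just keep the Hive ROW FORMAT and the Sqoop --input-fields-terminated-by value in sync, and pick a character that cannot appear inside your data.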
