Yes, the table you are importing data into has to exist. Sqoop export simply does batched "insert into" statements against that table. Since the data in Hadoop is often just a text file, there is not enough information for Sqoop to generate the table itself (no column names, data types, etc.).
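For example, a minimal DDL for the dbadb.mytest table used in the job below might look like this — the column names and types here are assumptions, and they must match the field order and types of the records in your HDFS input directory:

```sql
-- Hypothetical schema for the export target table.
-- Column names/types are illustrative only; they must line up with
-- the fields in the text files under the job's input directory.
CREATE TABLE dbadb.mytest (
  id   INT          NOT NULL PRIMARY KEY,
  name VARCHAR(255),
  val  DOUBLE
);
```

Create the table in MySQL first, then run the export job against it.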
Gwen

On Tue, May 27, 2014 at 10:33 PM, ch huang <[email protected]> wrote:
> hi, mailing list:
> I am testing Sqoop2, using it to pump my data from HDFS to a MySQL
> database. I ran the following job, and I wonder if I need to create the
> DDL first before I import data into the MySQL DB.
>
> sqoop:000> show connection
> +----+--------------+-----------+---------+
> | Id | Name         | Connector | Enabled |
> +----+--------------+-----------+---------+
> | 1  | mysql import | 1         | true    |
> +----+--------------+-----------+---------+
> sqoop:000> create job --xid 1 --type export
> Creating job for connection with id 1
> Please fill following values to create new job object
> Name: mysql export
>
> Database configuration
>
> Schema name: dbadb
> Table name: mytest
> Table SQL statement:
> Table column names:
>
> Input configuration
>
> Input directory: /mytest/helen
>
> Throttling resources
>
> Extractors:
> Loaders:
> New job was successfully created with validation status FINE and
> persistent id 1
> sqoop:000> show jobs
> The specified function "jobs" is not recognized.
> sqoop:000> show job
> +----+--------------+--------+-----------+---------+
> | Id | Name         | Type   | Connector | Enabled |
> +----+--------------+--------+-----------+---------+
> | 1  | mysql export | EXPORT | 1         | true    |
> +----+--------------+--------+-----------+---------+
> sqoop:000> submission start --jid 1
> Exception has occurred during processing command
> Unknown command: No such property: start for class: groovysh_evaluate
> sqoop:000> start job --jid 1
> Exception has occurred during processing command
> Server has returned exception: Exception: java.lang.Throwable Message:
> GENERIC_JDBC_CONNECTOR_0003: Unable to access meta data
