Thanks Gwen Shapira,
Even after leaving the columns input blank, I got some exceptions:


Error: org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0017:Error occurs 
during extractor run 


org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0012:The type is not 
supported - java.sql.Timestamp 


org.apache.sqoop.common.SqoopException: MAPRED_EXEC_0013:Cannot write to the 
data writer




After this I commented out the number of extractors and loaders, and then it works
fine.


But the table being imported is created in HDFS with 10 parts,
like part-m-00000 to part-m-00009.


Can't we reduce this number of parts?
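
For reference, the throttling lines I commented out were along these lines (just a
sketch against the 1.99.3 Java client API; the server URL, connection id, table,
directory, and values are placeholders, and I assume setting "throttling.extractors"
back to a small explicit value is what controls how many part files get written):

import org.apache.sqoop.client.SqoopClient;
import org.apache.sqoop.model.MJob;
import org.apache.sqoop.model.MJobForms;

public class ThrottledImportJob {
    public static void main(String[] args) {
        // server URL and the existing connection id (1) are placeholders
        SqoopClient client = new SqoopClient("http://localhost:12000/sqoop/");
        MJob job = client.newJob(1, MJob.Type.IMPORT);

        MJobForms connectorForm = job.getConnectorPart();
        MJobForms frameworkForm = job.getFrameworkPart();

        connectorForm.getStringInput("table.tableName").setValue("mytable");
        frameworkForm.getEnumInput("output.storageType").setValue("HDFS");
        frameworkForm.getEnumInput("output.outputFormat").setValue("TEXT_FILE");
        frameworkForm.getStringInput("output.outputDirectory").setValue("/output/mytable");

        // these are the two inputs I had commented out; fewer extractors
        // should mean fewer part-m-* files in the output directory
        frameworkForm.getIntegerInput("throttling.extractors").setValue(2);
        frameworkForm.getIntegerInput("throttling.loaders").setValue(2);

        client.createJob(job);
    }
}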


Also, can I have a better Hive query for loading this data into Hive,
or is there any connector that supports loading data into Hive?
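
For example, is an external table over the import directory the right approach,
something along these lines? (Just a sketch over HiveServer2 JDBC; the host,
credentials, table name, columns, path, and delimiter are placeholders and assume
the import wrote comma-delimited text files.)

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class HiveExternalTable {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC; host, port, and credentials are placeholders
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "hive", "");
        Statement stmt = con.createStatement();

        // external table over the directory Sqoop wrote to,
        // assuming comma-delimited text files
        stmt.execute("CREATE EXTERNAL TABLE IF NOT EXISTS mytable ("
                + " id INT, name STRING, created TIMESTAMP)"
                + " ROW FORMAT DELIMITED FIELDS TERMINATED BY ','"
                + " LOCATION '/output/mytable'");

        stmt.close();
        con.close();
    }
}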




Please suggest.


Thanks again

---- On Thu, 05 Feb 2015 23:38:48 +0530 Gwen Shapira <[email protected]> wrote ----

Specify the table name input and leave columns blank.
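
A minimal sketch of what that looks like with the Java client, continuing from the
same connectorForm as in your snippet below (the schema and table names here are only
placeholders; the point is simply that "table.columns" is never set):

 connectorForm.getStringInput("table.schemaName").setValue("mydb");
 connectorForm.getStringInput("table.tableName").setValue("mytable");
 // leave "table.columns" untouched so the connector selects every column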

On Thu, Feb 5, 2015 at 2:29 AM, Syed Akram <[email protected]> 
wrote:

Hi,

      I am using Sqoop 1.99.3, and I am importing data from MySQL to HDFS.
I want to import each table with all of its columns to HDFS without specifying
the column names,


 connectorForm.getStringInput("table.columns").setValue("*");


something like the above.


Please suggest.


Thanks in advance