Yes. Simply replace `--as-avrodatafile` with `--as-parquetfile`.

Please make sure the environment variables HIVE_HOME and HCAT_HOME are set 
correctly.
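
For example, here is a sketch of the resulting command, based on the original command quoted at the bottom of this thread with only the format flag swapped; the HIVE_HOME and HCAT_HOME paths are placeholders for your own installation:

$ export HIVE_HOME=/usr/lib/hive            # adjust to your install
$ export HCAT_HOME=/usr/lib/hive-hcatalog   # adjust to your install
$ bin/sqoop import -jt <jobtracker> \
    --connect jdbc:mysql://<mydbserver>/<mydb> \
    --username <dbuser> --password <dbpwd> \
    --target-dir /user/pkhadloya/sqoop/mytable \
    --query "<my query> WHERE \$CONDITIONS" \
    --num-mappers 1 --hive-import --hive-table mytable \
    --create-hive-table --as-parquetfile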

--
Qian Xu (Stanley)

From: pratik khadloya [mailto:[email protected]]
Sent: Friday, September 12, 2014 10:12 AM
To: [email protected]
Subject: Re: Hive import is not compatible with importing into AVRO format

Oh OK, thanks for the information, Xu. Can it be invoked using --as-parquetfile
with --hive-import?

Regards,
Pratik

On Thu, Sep 11, 2014 at 6:17 PM, Xu, Qian A <[email protected]> wrote:
Unfortunately, the Avro format is not supported for a Hive import. You can file 
a JIRA for that. Note that the trunk version of Sqoop1 supports Hive import as 
Parquet.

--
Qian Xu (Stanley)

From: [email protected]<mailto:[email protected]> 
[mailto:[email protected]<mailto:[email protected]>]
Sent: Friday, September 12, 2014 8:56 AM
To: [email protected]<mailto:[email protected]>
Subject: Re: Hive import is not compatible with importing into AVRO format


Hey there:
Does Hive support the Avro file format? As far as I know, it only supports 
RCFile, TextFile, and SequenceFile. Hope this is helpful to you.
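
For reference, here is a minimal sketch of how those storage formats are chosen in Hive DDL, run through the Hive CLI; the table and column names are made up for illustration:

$ hive -e "CREATE TABLE demo_text (id INT, msg STRING) STORED AS TEXTFILE;"
$ hive -e "CREATE TABLE demo_seq  (id INT, msg STRING) STORED AS SEQUENCEFILE;"
$ hive -e "CREATE TABLE demo_rc   (id INT, msg STRING) STORED AS RCFILE;"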

From: pratik khadloya <[email protected]>
Date: 2014-09-12 08:26
To: [email protected]<mailto:[email protected]>
Subject: Hive import is not compatible with importing into AVRO format
I am trying to import data from a free-form MySQL query into Hive. I need the 
files to be Avro data files, but when I pass the --as-avrodatafile option, I 
get a compatibility error. Is there a way I can tell Sqoop to use the Avro file 
format?

$ bin/sqoop import -jt <jobtracker> \
    --connect jdbc:mysql://<mydbserver>/<mydb> \
    --username <dbuser> --password <dbpwd> \
    --target-dir /user/pkhadloya/sqoop/mytable \
    --query "<my query> WHERE \$CONDITIONS" \
    --num-mappers 1 --hive-import --hive-table mytable \
    --create-hive-table --as-avrodatafile


~Pratik
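
For anyone hitting the same incompatibility: a workaround that is sometimes suggested is to drop --hive-import, land the Avro files in HDFS first, and then declare an external Hive table over them with the AvroSerDe. A rough sketch, assuming a Hive build that ships the AvroSerDe; the schema filename and upload step are assumptions, since Sqoop writes the generated .avsc to its local codegen output directory and you must copy it to HDFS yourself:

$ bin/sqoop import -jt <jobtracker> \
    --connect jdbc:mysql://<mydbserver>/<mydb> \
    --username <dbuser> --password <dbpwd> \
    --target-dir /user/pkhadloya/sqoop/mytable \
    --query "<my query> WHERE \$CONDITIONS" \
    --num-mappers 1 --as-avrodatafile
$ hadoop fs -put mytable.avsc /user/pkhadloya/sqoop/   # filename illustrative; taken from Sqoop's codegen output
$ hive -e "CREATE EXTERNAL TABLE mytable
  ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.avro.AvroSerDe'
  STORED AS
    INPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerInputFormat'
    OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat'
  LOCATION '/user/pkhadloya/sqoop/mytable'
  TBLPROPERTIES ('avro.schema.url'='hdfs:///user/pkhadloya/sqoop/mytable.avsc');"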
