Oh OK, thanks for the information, Xu. Can it be invoked using
--as-parquetfile with --hive-import?
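i.e., something along these lines? Just a sketch of what I would try against
a trunk build (same placeholders as my command below, not yet verified):

```shell
# Hypothetical: same free-form query import as before, but asking for
# Parquet output instead of Avro. Per Xu's note, this is only expected to
# work on trunk Sqoop1; flag support in released versions may differ.
bin/sqoop import -jt <jobtracker> \
  --connect jdbc:mysql://<mydbserver>/<mydb> \
  --username <dbuser> --password <dbpwd> \
  --target-dir /user/pkhadloya/sqoop/mytable \
  --query "<my query> WHERE \$CONDITIONS" \
  --num-mappers 1 \
  --hive-import --hive-table mytable --create-hive-table \
  --as-parquetfile
```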

Regards,
Pratik

On Thu, Sep 11, 2014 at 6:17 PM, Xu, Qian A <[email protected]> wrote:

>  Unfortunately, Avro format is not supported for a Hive import. You can
> file a JIRA for that. Note that the trunk version of Sqoop1 supports Hive
> import as Parquet.
>
>
>
> --
>
> Qian Xu (Stanley)
>
>
>
> *From:* [email protected] [mailto:[email protected]]
> *Sent:* Friday, September 12, 2014 8:56 AM
> *To:* [email protected]
> *Subject:* Re: Hive import is not compatible with importing into AVRO
> format
>
>
>
>
>
> Hey there:
>
>  Does Hive support the Avro file format? As far as I know, it only
> supports RCFile, TextFile, and SequenceFile. Hope this is helpful to you.
>
>
>
> *From:* pratik khadloya <[email protected]>
>
> *Date:* 2014-09-12 08:26
>
> *To:* [email protected]
>
> *Subject:* Hive import is not compatible with importing into AVRO format
>
> I am trying to import data from a free-form MySQL query into Hive. I need
> the files to be Avro data files, but when I pass the --as-avrodatafile
> option, I get a compatibility error. Is there a way I can tell Sqoop to use
> the Avro file format?
>
>
>
> $ bin/sqoop import -jt <jobtracker> --connect
> jdbc:mysql://<mydbserver>/<mydb> --username
> <dbuser> --password <dbpwd> --target-dir /user/pkhadloya/sqoop/mytable
> --query "<my query> WHERE \$CONDITIONS" --num-mappers 1 --hive-import
> --hive-table mytable --create-hive-table --as-avrodatafile
>
>
>
>
>
> ~Pratik
>
>
