Well, you can create an empty Hive table in ORC format and use --hive-overwrite 
in Sqoop.
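
A minimal sketch of that two-step approach. All names and connection details 
below (staff_orc, STAFF, the JDBC URL, the user) are hypothetical placeholders, 
not from the thread -- adjust them for your environment:

```shell
# Step 1: create the empty ORC table in Hive (hypothetical schema).
hive -e "
CREATE TABLE IF NOT EXISTS staff_orc (
  id      BIGINT,
  name    STRING,
  salary  DECIMAL(20,2)
)
STORED AS ORC;
"

# Step 2: import from Oracle into it; --hive-overwrite replaces the
# previous contents, which suits a weekly periodic job.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table STAFF \
  --hive-import \
  --hive-overwrite \
  --hive-table staff_orc \
  -m 4
```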

Alternatively you can use --hive-import and set hive.default.fileformat to ORC.

I recommend defining the schema properly on the command line, because Sqoop's 
type detection is based on JDBC (Java) types, which is not optimal. For 
example, a decimal(20,2) in Oracle should become a decimal(20,2) in Hive, not a 
double.
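
For example, Sqoop's --map-column-hive option lets you pin the Hive type for a 
column explicitly instead of relying on the JDBC-based default. The table, 
column, and connection details here are made-up placeholders:

```shell
# --map-column-hive overrides the derived Hive type for SALARY.
# Note: some Sqoop versions mis-parse the comma inside DECIMAL(20,2)
# and need it escaped or the precision omitted.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table STAFF \
  --hive-import \
  --hive-table staff_orc \
  --map-column-hive 'SALARY=DECIMAL(20,2)' \
  -m 4
```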

This may also be a good opportunity to review the database model in general. 
Especially for keys, one should use a numeric type rather than varchar. This 
saves a lot of space in the column and allows much faster lookups. 

> On 31 Jan 2016, at 14:15, Ashok Kumar <ashok34...@yahoo.com> wrote:
> 
> Thanks,
> 
> Can sqoop create this table as ORC in Hive?
> 
> 
> On Sunday, 31 January 2016, 13:11, Nitin Pawar <nitinpawar...@gmail.com> 
> wrote:
> 
> 
> check sqoop 
> 
> On Sun, Jan 31, 2016 at 6:36 PM, Ashok Kumar <ashok34...@yahoo.com> wrote:
>   Hi,
> 
> What is the easiest method of importing data from an Oracle 11g table to Hive 
> please? This will be a weekly periodic job. The source table has 20 million 
> rows.
> 
> I am running Hive 1.2.1
> 
> regards
> 
> 
> -- 
> Nitin Pawar