True, other tools may be better...

But with a RowHandler, iBATIS could easily meet this requirement.  I've used
iBATIS RowHandlers for ETL-type work and achieved great performance with
batch updates: 15,000 records per second on the Netflix Prize data.
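To make the pattern concrete, here is a minimal, self-contained sketch of how a RowHandler streams rows one at a time and flushes writes in batches. The `RowHandler` interface below is a stand-in for illustration (the real iBATIS 2.x callback is `com.ibatis.sqlmap.client.event.RowHandler`, driven by `SqlMapClient.queryWithRowHandler`), and the batch size and counters are invented for the example:

```java
import java.util.ArrayList;
import java.util.List;

public class RowHandlerSketch {

    // Stand-in for iBATIS's RowHandler callback interface:
    // the framework calls handleRow() once per fetched row,
    // so the full result set never has to fit in memory.
    interface RowHandler {
        void handleRow(Object valueObject);
    }

    // Buffers rows and flushes them in fixed-size batches,
    // mimicking a startBatch()/executeBatch() cycle per flush.
    static class BatchingRowHandler implements RowHandler {
        private final int batchSize;
        private final List<Object> buffer = new ArrayList<>();
        int flushes = 0;   // number of batches written
        int rowsSeen = 0;  // total rows handled

        BatchingRowHandler(int batchSize) {
            this.batchSize = batchSize;
        }

        @Override
        public void handleRow(Object valueObject) {
            rowsSeen++;
            buffer.add(valueObject);
            if (buffer.size() >= batchSize) {
                flush();
            }
        }

        void flush() {
            if (buffer.isEmpty()) return;
            // In real code: start a batch, run an insert statement
            // for each buffered row, then execute the batch.
            buffer.clear();
            flushes++;
        }
    }

    public static void main(String[] args) {
        BatchingRowHandler handler = new BatchingRowHandler(1000);
        // Simulate a query streaming 2500 rows through the handler.
        for (int i = 0; i < 2500; i++) {
            handler.handleRow("row-" + i);
        }
        handler.flush(); // don't forget the final partial batch
        System.out.println(handler.rowsSeen + " rows in "
                + handler.flushes + " batches");
    }
}
```

The point of the design is that memory stays bounded by the batch size rather than the result-set size, and the database sees large batched inserts instead of row-at-a-time round trips.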

Clinton

On Tue, Jul 22, 2008 at 3:51 PM, Christopher Lamey <[EMAIL PROTECTED]>
wrote:

> On 7/22/08 9:59 AM, "Emi Lu" <[EMAIL PROTECTED]> wrote:
>
> > Greeting,
> >
> > I will load millions of rows from Oracle10 to PostgreSQL8 through JDBC.
> >
> > I like the spring+ibatis framework. May I know how iBATIS deals with big
> > chunks of data please?
> >
> > The example I had is the following, would you suggest better solution?
>
> JDBC is not good for this type of thing.  It's much better to use the
> databases' bulk tools.  Maybe something like this:
>
> 1) Run this in sql*plus (from the OraFaq -
>
> http://www.orafaq.com/wiki/SQL*Loader_FAQ#Is_there_a_SQL.2AUnloader_to_download_data_to_a_flat_file.3F):
>
> set colsep \t
> set echo off newpage 0 space 0 pagesize 0 feed off head off trimspool on
> spool /tmp/oradata.csv
> select col1, col2, col3
>  from tab1
>  where col2 = 'XYZ';
> spool off
>
> 2) Use the psql COPY command:
>
> \set ON_ERROR_STOP true
> copy tab1 (
>    col1, col2, col3
>    )
> from '/tmp/oradata.csv'
> with
>    delimiter as E'\t'
>    csv
> \g
>
> You could leave out the \t stuff (the "set colsep..." in Oracle and
> "delimiter as..." in Postgres) and just use commas.
>
> Do a "\h copy" in psql to get the full syntax of the COPY command.
>
> Cheers,
> Chris
>
>
