Got it! I will use plain JDBC.

Thanks a lot for all your inputs!
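Since the plan is plain JDBC, the batch-with-periodic-commits pattern Jeff describes might look like the sketch below. This is only an outline, not tested against a real database; the table name `table1` and columns `id`/`name` are placeholder assumptions, and the batch/commit sizes are arbitrary starting points to tune.

```java
import java.sql.*;

public class BulkCopy {
    static final int BATCH_SIZE = 1000;       // rows per JDBC batch flush
    static final int COMMIT_INTERVAL = 10000; // rows per commit (keeps transactions small)

    // Pure helpers so the flush/commit cadence is easy to reason about.
    static boolean shouldFlush(long rowsSoFar) {
        return rowsSoFar % BATCH_SIZE == 0;
    }

    static boolean shouldCommit(long rowsSoFar) {
        return rowsSoFar % COMMIT_INTERVAL == 0;
    }

    // Streams rows from Oracle into PostgreSQL using addBatch/executeBatch,
    // committing every COMMIT_INTERVAL rows. Table and column names are
    // placeholders for illustration.
    public static void copy(Connection oracle, Connection postgres) throws SQLException {
        postgres.setAutoCommit(false);
        try (Statement sel = oracle.createStatement();
             ResultSet rs = sel.executeQuery("SELECT id, name FROM table1");
             PreparedStatement ins = postgres.prepareStatement(
                     "INSERT INTO table1 (id, name) VALUES (?, ?)")) {
            long rows = 0;
            while (rs.next()) {
                ins.setLong(1, rs.getLong(1));
                ins.setString(2, rs.getString(2));
                ins.addBatch();
                rows++;
                if (shouldFlush(rows)) ins.executeBatch();
                if (shouldCommit(rows)) postgres.commit();
            }
            ins.executeBatch();   // flush the partial tail batch
            postgres.commit();
        } catch (SQLException e) {
            postgres.rollback();
            throw e;
        }
    }
}
```

Periodic commits avoid one multi-million-row transaction; if a chunk fails you only roll back since the last commit, at the cost of having to track how far the load got.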


Jeff Butler wrote:
Using JDBC/iBATIS for bulk data loads will usually be slower than using your
database's bulk load utility.  See this page from the PostgreSQL
documentation for more information:

http://www.postgresql.org/docs/8.1/interactive/populate.html
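For reference, the bulk loader that page discusses is PostgreSQL's COPY command. A minimal fragment, assuming the Oracle data has first been exported to a CSV file readable by the server (the path and column names here are placeholders):

```sql
-- Server-side bulk load; far faster than row-by-row INSERTs.
COPY table1 (id, name) FROM '/tmp/table1.csv' WITH CSV;
```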

If you feel you must use iBATIS, then I'd recommend using a batch and
issuing periodic commits.  See the iBATIS developer's guide (page 56) for an
example.  But, this will likely be much slower than using the bulk loader.

Jeff Butler

On Tue, Jul 22, 2008 at 10:52 AM, luy <[EMAIL PROTECTED]> wrote:

Greetings,

I will load millions of rows from Oracle 10 to PostgreSQL 8 through JDBC.

I like the Spring+iBATIS framework. May I know how iBATIS deals with big
chunks of data, please?

The approach I have is the following; would you suggest a better solution?

(1) Transaction begins

(2) Select from oracle.table1 into object1
    (a possible refinement: split table1 into
     table1_subgroupdata1, ... n)

(3) For each row in object1
    => insert into PostgreSQL

(4) If success, commit;
    else rollback

(5) Transaction ends
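One way to realize the split in step (2), assuming table1 has an indexed, monotonically increasing numeric id column (an assumption; your actual key may differ), is keyset pagination with Oracle's ROWNUM, so each chunk is a bounded query rather than one huge result set. A plain-JDBC sketch:

```java
import java.sql.*;

public class ChunkedReader {
    static final int CHUNK = 10000; // rows per chunk; tune to taste

    // Oracle keyset-pagination query: rows with id greater than the last id
    // seen, capped at CHUNK rows via ROWNUM. Table/column names are placeholders.
    static String chunkSql() {
        return "SELECT id, name FROM ("
             + "SELECT id, name FROM table1 WHERE id > ? ORDER BY id"
             + ") WHERE ROWNUM <= " + CHUNK;
    }

    interface RowHandler {
        void handle(long id, String name);
    }

    // Reads the whole table in CHUNK-sized slices, passing each row to the
    // handler (which would do the PostgreSQL insert in this scenario).
    public static void readAll(Connection oracle, RowHandler handler) throws SQLException {
        long lastId = Long.MIN_VALUE;
        try (PreparedStatement ps = oracle.prepareStatement(chunkSql())) {
            while (true) {
                ps.setLong(1, lastId);
                int n = 0;
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        handler.handle(rs.getLong(1), rs.getString(2));
                        lastId = rs.getLong(1);
                        n++;
                    }
                }
                if (n < CHUNK) break; // short chunk means we reached the end
            }
        }
    }
}
```

This keeps memory flat on the reading side regardless of table size, and each chunk boundary is a natural place to commit on the PostgreSQL side.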

Thanks a lot!

--
View this message in context:
http://www.nabble.com/ibatis%2Bspringframework-for-millions-data-loading-tp18592540p18592540.html
Sent from the iBATIS - User - Java mailing list archive at Nabble.com.



