Gavin D'mello wrote:

> I'm trying to do a bulk insert using DBI and Oracle for about 248 rows,
> and it's proving to be pretty slow and expensive. Is there any way I can
> do a bulk insert? I am using prepare_cached and execute with parameters
> ......

SQL*Loader is extremely fast if you can fit your problem into its world.
Sometimes it's easy to use Perl to preprocess the input into a SQL*Loader
friendly file.
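
For example, something along these lines (the table, column, and file names
here are just placeholders):

    # Perl: dump your rows (arrayrefs) to a comma-delimited data file
    open my $fh, '>', 'rows.dat' or die "can't write rows.dat: $!";
    print $fh join(',', @$_), "\n" for @rows;
    close $fh;

    -- rows.ctl: a minimal SQL*Loader control file
    LOAD DATA
    INFILE 'rows.dat'
    APPEND INTO TABLE my_table
    FIELDS TERMINATED BY ','
    (id, name, amount)

Then load it with something like:

    sqlldr userid=user/pass control=rows.ctl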

For Perl/DBI, I get good results by playing around with my commit size.  Rather
than using the default AutoCommit setting, which commits each operation, I turn
AutoCommit off, then gather my inserts up to a given batch size before explicitly
committing.  I typically make this a configurable parameter for anything I bulk
load with any frequency.  There's a point where you can make this too big; i.e.,
if you try to commit everything at the end of a big bulk load, it will take a
long time.  But with 248 rows you might just try setting AutoCommit to zero for
starters, then calling $dbh->commit() at the end.
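
Roughly what I have in mind (table and column names are made up; tune
$commit_size for your own data):

    use DBI;

    my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'pass',
                           { AutoCommit => 0, RaiseError => 1 });

    my $sth = $dbh->prepare_cached(
        'INSERT INTO my_table (id, name) VALUES (?, ?)');

    my $commit_size = 100;     # configurable batch size
    my $count = 0;
    for my $row (@rows) {
        $sth->execute(@$row);
        # commit every $commit_size inserts rather than every row
        $dbh->commit() if ++$count % $commit_size == 0;
    }
    $dbh->commit();            # pick up whatever's left over
    $dbh->disconnect();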

-- 
Steve Sapovits
Global Sports Interactive
http://www.globalsports.com
Work Email:  [EMAIL PROTECTED]
Home Email:  [EMAIL PROTECTED]
Work Phone:  610-491-7087
Cell Phone:  610-574-7706
