[EMAIL PROTECTED] said:
> 1. My anticipated bottleneck under postgres is that the DB-writing app
> must parse incoming bursts of data and store it in the DB.  The machine
> sending this data is seeing a delay in processing.  Debugging has shown
> that the INSERTs (on the order of a few thousand) are where most of the
> time is wasted.

Jason,

You might get better performance simply by wrapping the inserts in a
transaction, or wrapping a transaction around a few hundred inserts at a
time.  A transaction is a very expensive operation, and unless you group
your inserts into transactions of several inserts, you pay the transaction
price for every single insert.  That has a devastating impact on
performance no matter what database you're using, so long as it's ACID
compliant.
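For illustration, here's a minimal sketch of that batching idea using
Python's sqlite3 module; the "events" table and batch size are made up
for the example, not taken from your application:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")

rows = [("message %d" % i,) for i in range(5000)]

# Without an explicit transaction, each INSERT pays a full commit
# (a sync to disk).  Grouping a few hundred inserts per transaction
# amortizes that cost across the whole batch.
BATCH = 500
for start in range(0, len(rows), BATCH):
    with conn:  # emits BEGIN ... COMMIT around the whole batch
        conn.executemany(
            "INSERT INTO events (payload) VALUES (?)",
            rows[start:start + BATCH],
        )

count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
```

The same pattern applies in any client library: issue BEGIN before the
batch and COMMIT after it, rather than letting each insert autocommit.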

SQLite is a wonderful tool and absolutely saving my bacon on a current
project, but you can save yourself the trouble of rewriting your database
access by making a slight modification to your code.  This assumes, of
course, that you aren't already using transactions.

Clay Dowling
-- 
Simple Content Management
http://www.ceamus.com
