Arthur,
Full source and a small test app are at
http://sourceforge.net/projects/adodotnetsqlite
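
For what it's worth, the transaction batching described in the thread below
is easy to try yourself. Here is a minimal sketch; the thread concerns the
C interface and our ADO.NET provider, but Python's stdlib sqlite3 module is
used here purely for illustration, and the table and column names are made
up for the example:

```python
# Sketch: measuring transaction-batched inserts into an in-memory SQLite
# database. One BEGIN/COMMIT pair wraps all the inserts, which is the
# pattern that makes file-db insert rates match in-memory rates.
import sqlite3
import time

def batch_insert(conn, n):
    """Insert n rows inside one explicit transaction; return rows/sec."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE t (id INTEGER, val TEXT)")
    start = time.perf_counter()
    cur.execute("BEGIN")  # one transaction around all inserts
    for i in range(n):
        cur.execute("INSERT INTO t VALUES (?, ?)", (i, "row %d" % i))
    cur.execute("COMMIT")  # commit once at the end
    elapsed = time.perf_counter() - start
    return n / elapsed

# isolation_level=None puts the connection in autocommit mode, so the
# explicit BEGIN/COMMIT above are the only transaction boundaries.
conn = sqlite3.connect(":memory:", isolation_level=None)
rate = batch_insert(conn, 10000)
count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print("%d rows, ~%.0f inserts/sec" % (count, rate))
```

The same experiment against a file database (swap ":memory:" for a
filename) shows the cost of per-statement commits if you drop the
explicit BEGIN/COMMIT.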

> -----Original Message-----
> From: Arthur C. Hsu [mailto:[EMAIL PROTECTED] 
> Sent: Tuesday, November 18, 2003 2:22 AM
> To: Tim McDaniel; [EMAIL PROTECTED]
> Subject: RE: [sqlite] In-memory DB performance tuning
> 
> 
> Thanks for the quick response.  Any samples that I can reference?
> 
> -Arthur 
> 
> -----Original Message-----
> From: Tim McDaniel [mailto:[EMAIL PROTECTED] 
> Sent: Monday, November 17, 2003 11:54 PM
> To: [EMAIL PROTECTED]
> Subject: RE: [sqlite] In-memory DB performance tuning
> 
> Arthur,
> 
> I've just done some performance tests using our ADO.NET 
> provider for SQLite. On a 2GHz P4 system, we get about 35,000 
> inserts/sec and 175,000 reads/sec. That holds for both a file db 
> with the inserts wrapped in a transaction and an in-memory db 
> without one; they perform the same. The performance is also 
> linear: I've run 10k, 100k, 500k, and 1 million row tests, and 
> the insert and read rates are the same for all. If you are using 
> the C interface, your results should be comparable (scaled to 
> your system speed, of course), especially considering the slight 
> interop hit we take for ADO.NET.
> 
> Tim
> 
> > -----Original Message-----
> > From: Arthur C. Hsu [mailto:[EMAIL PROTECTED]
> > Sent: Tuesday, November 18, 2003 1:16 AM
> > To: 'Steve Dekorte'
> > Cc: [EMAIL PROTECTED]
> > Subject: RE: [sqlite] In-memory DB performance tuning
> > 
> > 
> > Yes, I know about the Berkeley DB and gdbm solutions out there.
> > However, I need multiple columns, and I need more sophisticated 
> > features like ORDER BY and GROUP BY.
> > 
> > -Arthur
> > 
> > -----Original Message-----
> > From: Steve Dekorte [mailto:[EMAIL PROTECTED]
> > Sent: Monday, November 17, 2003 10:38 PM
> > To: Arthur C. Hsu
> > Cc: [EMAIL PROTECTED]
> > Subject: Re: [sqlite] In-memory DB performance tuning
> > 
> > 
> > On Nov 17, 2003, at 9:55 PM, Arthur C. Hsu wrote:
> > > Any clues on how I can further squeeze the performance?  Or is 
> > > the limitation by design?  I just can't understand why the 
> > > first 6000 rows are amazingly fast but later the speed drops 
> > > so dramatically.
> > 
> > Hi Arthur,
> > 
> > If you really need performance and can model your data as key/value
> > pairs, then you might consider something like SleepyCat. If I 
> > remember correctly, it can read/write around 50K rows per second.
> > 
> > -- Steve
> > 
> > 
> > 
> > 
> ---------------------------------------------------------------------
> > To unsubscribe, e-mail: [EMAIL PROTECTED]
> > For additional commands, e-mail: [EMAIL PROTECTED]
> > 
> > 
> 
> 
> 
> 

