On Tue, Jun 28, 2005 at 04:15:15PM -0400, Patrick Dunnigan wrote:
> I am currently using SQLite to process about 400 million records (and
> climbing) a day by reading files, importing them into SQLite, and
> summarizing. The summed data goes into Oracle. This is a production
> application that is very stable. Holding the data in SQLite in memory,
> as opposed to in a C struct, reduced development time and makes it easy
> to change the summarization output by just modifying the SQL statements.

I would also be interested in hearing about your solution.  We have an
internal custom ETL solution that works quite well, but I've been toying
with the idea of using SQLite databases, rather than flat files, as the
unit of work for transformations.
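Just so I'm sure I'm picturing the pattern correctly, here's a rough
sketch of what I have in mind (plain C against the SQLite API; the
table, column, and sample values are made up, not your schema): stage
the raw rows in an in-memory database, summarize with a GROUP BY, and
push the summed rows on to Oracle.

#include <stdio.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    sqlite3_stmt *stmt;

    if (sqlite3_open(":memory:", &db) != SQLITE_OK)
        return 1;

    /* Staging table standing in for what would otherwise be a C struct array. */
    sqlite3_exec(db,
        "CREATE TABLE raw(account TEXT, bytes INTEGER);"
        "BEGIN;"
        /* In the real thing these INSERTs would come from parsing the
           input files, all inside one transaction for speed. */
        "INSERT INTO raw VALUES('a', 10);"
        "INSERT INTO raw VALUES('a', 32);"
        "INSERT INTO raw VALUES('b', 7);"
        "COMMIT;",
        NULL, NULL, NULL);

    /* The summarization is just SQL, so changing the output means
       changing this query rather than the C code. */
    sqlite3_prepare_v2(db,
        "SELECT account, SUM(bytes) FROM raw GROUP BY account",
        -1, &stmt, NULL);
    while (sqlite3_step(stmt) == SQLITE_ROW) {
        /* These summed rows are what would get loaded into Oracle. */
        printf("%s %lld\n",
               (const char *)sqlite3_column_text(stmt, 0),
               (long long)sqlite3_column_int64(stmt, 1));
    }
    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}

Wrapping the bulk inserts in a single transaction seems like the obvious
move at that volume; I assume you're doing something along those lines.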

Are you summarizing in batches of 1 million records?  All of them each
day?  If you could give a brief description of your workflow, I would be
quite interested.

enjoy,

-jeremy

-- 
========================================================================
 Jeremy Hinegardner                              [EMAIL PROTECTED] 