--- PokerAce <[EMAIL PROTECTED]> wrote:
> I'm trying to see if SQLite is suitable for large databases (> 1 GB,
> millions of rows in each of several tables). Initially, the memory usage
> was outrageous (~500 MB for a 1.3 GB db), but I got that down to < 30 MB
> by setting the cache size to 0 and setting a low soft heap limit. That
> works when I'm reading from the database, but when I'm inserting rows,
> the memory usage grows back into the ~500 MB range. My goal is to never
> have the application use more than 100 MB of memory, preferably much
> less than that. Does anyone have any suggestions?
If your cache_size is 0, I'm not sure what's eating up 500 MB of RAM for inserts. Some questions that might give you some ideas:

- Are you certain it's SQLite's RAM, and not your application's?
- Is your temp_store set to memory or file?
- How are you performing your inserts (prepared statements)?
- How many rows are you inserting per batch?
- What's your database page_size?
- Can you build your table indexes after you populate the data?

Is this for a poker showdown database by any chance? http://games.cs.ualberta.ca/poker/IRC/
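For what it's worth, the questions above add up to a bulk-load recipe: cap the page cache, keep temp b-trees on disk, insert in batched transactions with a prepared statement, and create indexes only after the data is loaded. Here's a minimal sketch in Python's sqlite3 module; the table name `hands`, the batch size, and the pragma values are illustrative, not something from the original post:

```python
import os
import sqlite3
import tempfile

# Hypothetical database and schema, purely for illustration.
db_path = os.path.join(tempfile.mkdtemp(), "showdowns.db")
conn = sqlite3.connect(db_path)

# Keep SQLite's own memory use bounded during the load.
conn.execute("PRAGMA cache_size = 200")   # page cache capped at ~200 pages
conn.execute("PRAGMA temp_store = FILE")  # temp tables/indices go to disk

conn.execute("CREATE TABLE hands (id INTEGER PRIMARY KEY, cards TEXT)")

# One transaction per batch: the statement is prepared once and reused,
# and the journal is flushed at each commit instead of growing unbounded.
rows = ((i, "AhKs") for i in range(100000))
with conn:
    conn.executemany("INSERT INTO hands VALUES (?, ?)", rows)

# Build the index after the bulk load, not before: maintaining a b-tree
# index row-by-row during inserts is a common source of memory/CPU cost.
conn.execute("CREATE INDEX idx_hands_cards ON hands(cards)")

count = conn.execute("SELECT COUNT(*) FROM hands").fetchone()[0]
conn.close()
```

The C API also has `sqlite3_soft_heap_limit()` for the soft heap limit the original poster mentions; the Python wrapper doesn't expose it directly, so the pragmas above are the knobs you'd reach for from Python.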