I proposed such a program earlier in this discussion. I would envisage a separate program that strips out a list of keys from the database, sorts it, then allocates space in the DB file for the resulting index and builds it bottom-up. It would be an off-line process, but a fast one, and it would make raising indices on large databases time-efficient. A rough sketch of the idea follows below.
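
To make the bottom-up build concrete, here is a rough, hypothetical sketch in plain C (not SQLite code): the keys are sorted first, packed into full leaf nodes left to right, and each parent level is then built from the smallest key of every child, so no node is ever split or rebalanced. FANOUT, build_level() and the in-memory Node struct are illustrative stand-ins for real pages written into the DB file.

  /*
  ** Hypothetical sketch only: build a B-tree bottom-up from a
  ** pre-sorted key list.  Leaves are packed left to right, then
  ** each parent level is built from its children's smallest keys,
  ** so no page split or rebalance ever happens.
  */
  #include <stdio.h>
  #include <stdlib.h>

  #define FANOUT 4                   /* keys per node; real pages hold far more */

  typedef struct Node {
    int nKey;                        /* number of keys actually stored      */
    long aKey[FANOUT];               /* leaf keys or separator keys         */
    struct Node *aChild[FANOUT];     /* NULL at the leaf level              */
  } Node;

  static int cmpLong(const void *a, const void *b){
    long x = *(const long*)a, y = *(const long*)b;
    return (x>y) - (x<y);
  }

  /* Pack n sorted keys (and optional child pointers) into full nodes.
  ** Returns the array of new nodes and their count via *pnOut. */
  static Node **build_level(long *aKey, Node **aChild, int n, int *pnOut){
    int nNode = (n + FANOUT - 1)/FANOUT, i, j;
    Node **apNode = malloc(nNode*sizeof(Node*));
    for(i=0; i<nNode; i++){
      Node *p = calloc(1, sizeof(Node));
      for(j=0; j<FANOUT && i*FANOUT+j<n; j++){
        p->aKey[j] = aKey[i*FANOUT+j];
        p->aChild[j] = aChild ? aChild[i*FANOUT+j] : NULL;
        p->nKey++;
      }
      apNode[i] = p;
    }
    *pnOut = nNode;
    return apNode;
  }

  int main(void){
    long aKey[1000];
    int i, n = 1000, nNode;
    Node **apLevel;

    for(i=0; i<n; i++) aKey[i] = rand();    /* stand-in for scanning the table */
    qsort(aKey, n, sizeof(long), cmpLong);  /* would be an external sort in practice */

    /* Build the leaf level first, then each parent level from the
    ** smallest key of every child, until a single root remains. */
    apLevel = build_level(aKey, NULL, n, &nNode);
    while( nNode>1 ){
      long *aSep = malloc(nNode*sizeof(long));
      Node **apNext;
      for(i=0; i<nNode; i++) aSep[i] = apLevel[i]->aKey[0];
      apNext = build_level(aSep, apLevel, nNode, &nNode);
      free(aSep);
      free(apLevel);
      apLevel = apNext;
    }
    printf("root holds %d keys\n", apLevel[0]->nKey);
    return 0;
  }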

Based on our experience of building a B-tree with such a program, compared to successive insertions a speed improvement of at least an order of magnitude in raising an index could be expected.

By making it an independent program it can be lean, mean and fast, and it need not touch the regular SQLite library at all.

Stephen Toney wrote:
On Wed, 2007-03-28 at 08:23 -0600, Dennis Cote wrote:


It might make sense to create a separate standalone utility program (like sqlite3_analyzer) that reuses some of the sqlite source to do bulk inserts into a table in a database file as fast as possible, without having to worry about locking or journaling etc.


That would solve my problem too (thread: "CREATE INDEX performance" on
indexing a 5.8-million record table). I'd love something like that!



