I was pulling my hair out trying to figure out why insertion into a table
was becoming extremely slow as the table grew.
I was creating a new table and populating it with 100,000 rows of data (as a
test case; ultimately I want to populate it with over a million rows). A rough
sketch of the test is below the timings.
[Insertion A] When a Text Column was NOT Unique it would take:
8875 ms = ~9 seconds
[Insertion B] When a Text Column was Unique it would take:
155781 ms = ~156 seconds
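For context, here is a minimal sketch of the test using Python's built-in
sqlite3 module. My actual harness uses System.Data.SQLite from C#, so the
table and column names, the random 32-character hex values, and the
single-transaction batching here are just assumptions for illustration:

import os
import sqlite3
import time

N = 100_000

def timed_insert(unique):
    # Fresh database file per run; in my setup this path is on the RAMDisk.
    path = "unique.db" if unique else "plain.db"
    if os.path.exists(path):
        os.remove(path)
    conn = sqlite3.connect(path)
    constraint = "UNIQUE" if unique else ""
    conn.execute(f"CREATE TABLE t (id INTEGER PRIMARY KEY, txt TEXT {constraint})")
    # Random text values stand in for my real data; random keys are what
    # make maintaining the UNIQUE index IO-bound as the table grows.
    rows = ((os.urandom(16).hex(),) for _ in range(N))
    start = time.perf_counter()
    with conn:  # one transaction around the whole batch (assumption)
        conn.executemany("INSERT INTO t (txt) VALUES (?)", rows)
    print(f"unique={unique}: {(time.perf_counter() - start) * 1000:.0f} ms")
    conn.close()

timed_insert(False)  # Insertion A: plain text column
timed_insert(True)   # Insertion B: UNIQUE text column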
Insertion B appears to be IO/Disk bound: CPU utilization drops to around
4-8% once the table reaches a certain size. The insertion time grows
exponentially as the table grows, which is the main problem.
Is there anything I can do to speed up the insertion when the Text Column is
Unique?
My workaround so far has been to put the database file on a RAMDisk to reduce
the IO/Disk bottleneck, but once the database grows too large, that won't be
an option.
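(In terms of the Python sketch above, the closest built-in equivalent I know
of is an in-memory database, which has the same doesn't-fit-in-RAM
limitation:)

import sqlite3

# An in-memory database behaves like the RAMDisk workaround: no disk IO,
# but the entire database has to fit in RAM.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (txt TEXT UNIQUE)")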
I'm using System.Data.SQLite, although from the SQLite changelog I don't see
any fix that would resolve this issue.
Thank you for your help.