One way to get a clue is to try doing this in stages. First, start over
and import a much smaller amount of data, say just a 1GB fraction; see
if that completes, and if it does, note how long it takes and what it
uses in the way of disk, memory, and so on. If 1GB doesn't work, start
smaller still, until you find a size that does work. Once something
works, try double that size and see whether the resource usage is
roughly linear. Then double it again, and so on, until you have
time/space progression figures. Those should help you predict how long
the full 187GB would take if it were going to succeed. Otherwise, at
some point
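For example, in Python with the built-in sqlite3 module, the timing
loop might look like this (the input file name, table name, and
one-column schema are placeholders, not anything from your setup):

import itertools
import sqlite3
import time

INPUT = "positions.txt"   # placeholder input file name

def timed_import(n_lines):
    """Import the first n_lines records into a fresh database and
    return the elapsed wall-clock time in seconds."""
    db = sqlite3.connect(f"test_{n_lines}.db")
    db.execute("CREATE TABLE positions (pos TEXT)")
    start = time.perf_counter()
    with open(INPUT) as f:
        db.executemany(
            "INSERT INTO positions VALUES (?)",
            ((line.rstrip("\n"),)
             for line in itertools.islice(f, n_lines)))
    db.commit()
    db.close()
    return time.perf_counter() - start

# Start small and keep doubling until the trend is clear (or it hangs).
n = 1_000_000
for _ in range(6):
    print(f"{n:>12} rows: {timed_import(n):8.1f} s")
    n *= 2

If the times roughly double along with the row counts, the full import
should be predictable; a sudden jump or a stall points at the size
where things go wrong.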
you should find the smallest size at which the hang occurs. -- Darren Duncan
On 2016-08-03 8:00 PM, Kevin O'Gorman wrote:
I'm working on a hobby project, but the data has gotten a bit out of
hand. I thought I'd put it in a real database rather than flat ASCII
files.

I've got a problem set of about 1 billion game positions, 187GB in all,
to work on (no, I won't have to solve them all), which took a generator
program about 4 hours just to write. I wrote code to turn them into
something SQLite could import. Actually, the job is an import, then
building a non-primary index, then an ALTER TABLE to add a column, all
in sqlite3.
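In outline, the three steps are something like this (a simplified
Python sketch with stand-in names and a toy schema; the real job ran
through the sqlite3 shell):

import sqlite3

db = sqlite3.connect("positions.db")
# Toy schema; the real table holds ~1 billion positions.
db.execute("""CREATE TABLE positions (
                  pos   TEXT PRIMARY KEY,  -- encoded game position
                  moves INTEGER            -- per-position attribute
              )""")

# Step 1: bulk import (executemany stands in for the shell's .import).
db.executemany("INSERT INTO positions (pos, moves) VALUES (?, ?)",
               [("<position>", 0)])  # placeholder row

# Step 2: build a non-primary index.
db.execute("CREATE INDEX positions_moves ON positions (moves)")

# Step 3: add a column with ALTER TABLE.
db.execute("ALTER TABLE positions ADD COLUMN score INTEGER")

db.commit()
db.close()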
The database was growing for about 1-1/2 days. Then its journal
disappeared, the file size dropped to zero, but sqlite3 is still
running at 100% CPU, now for a total of 3800+ minutes (63+ hours). The
database is still locked, but I have no idea what sqlite3 is doing, or
if it will ever stop. All partitions still have lots of space left
(most of this is running on an 11 TiB RAID partition). Here's what I
gave to sqlite3 on my Linux system: