I have a PHP script that goes through a 60,000-row table (currently a 100 MB
SQLite 3 database that could grow to 200 MB) and looks for rows that
are missing data.  The script then retrieves the data (whois queries and
traceroute results) and inserts it into the table.  I want to run
several instances of this script to populate the database more quickly, but I
run into constant "database is locked" errors with even two instances
of the script.

The FAQ indicates that concurrency really isn't necessary for most situations
given the speed of today's computers.  However, my server is a 12-year-old
IBM PC running Linux with 64 MB of memory, and the hard disk is very slow.
Is this going to prohibit running more than one process that writes to a
single database table?
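Even on slow hardware, contention can be reduced by giving each instance a disjoint slice of the work, so the processes only compete for the lock during their brief writes, never for the same rows. A sketch of one way to partition by rowid (again using hypothetical `hosts`/`ip`/`whois` names):

```python
import sqlite3

def rows_for_worker(db_path, n_workers, worker_id):
    """Fetch only this worker's share of the rows missing data.

    Partitioning by rowid modulo the worker count guarantees no two
    workers ever pick the same row, so they never duplicate a whois
    or traceroute lookup and only contend for the database lock when
    writing results back.
    """
    conn = sqlite3.connect(db_path, timeout=5.0)
    cur = conn.execute(
        "SELECT rowid, ip FROM hosts "
        "WHERE whois IS NULL AND rowid % ? = ?",
        (n_workers, worker_id),
    )
    rows = cur.fetchall()
    conn.close()
    return rows
```

Each instance would then be launched with its own worker_id (0, 1, ...) and the total worker count.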

I am new to databases, so perhaps there is something very basic I am
overlooking here.  I wanted to give SQLite a real effort before trying MySQL.

Thanks,
Chad
_______________________________________________
sqlite-users mailing list
sqlite-users@sqlite.org
http://sqlite.org:8080/cgi-bin/mailman/listinfo/sqlite-users