Ok, I'm working on a dynamic site with PHP and MySQL. I allow people to
upload CSV files, which I process via a cron job and PHP script to get the
data from the files into a table in my DB.
My first thought was to read through each file line by line and do an
insert for each line, checking the validity of the content prior to the
insert.
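In rough terms the current loop looks like the sketch below (simplified;
the table name import_data, the column list, and the validity check are
just placeholders for my real code, and I'm using mysqli here):

<?php
// Simplified sketch of the current per-line approach (placeholders
// for the real table, columns, and validation rules).
$db = new mysqli('localhost', 'user', 'pass', 'mydb');

$fh = fopen('/path/to/upload.csv', 'r');
$stmt = $db->prepare(
    'INSERT INTO import_data (col1, col2, col3) VALUES (?, ?, ?)'
);
while (($row = fgetcsv($fh)) !== false) {
    // Placeholder validity check: right field count, non-empty key.
    if (count($row) !== 3 || $row[0] === '') {
        continue;
    }
    $stmt->bind_param('sss', $row[0], $row[1], $row[2]);
    $stmt->execute();   // one INSERT per CSV line
}
$stmt->close();
fclose($fh);
$db->close();
?>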
This works fine for 200k records, but after that MySQL gets fussy and
gives me 127 errors when I do selects etc., and it downright blows up if I
get over 1 million records. This is all with a known-good test file for
the import. I tried FLUSH TABLES after every file import, which increased
the script run time 5x, and I still had the same problem.
My question now is: should I come at this from a different angle, reading
through the import files, validating the content, writing it out to a new
file, and then using LOAD DATA INFILE on one big input file? Or should I
just validate the contents of the uploaded files and then do a LOAD DATA
INFILE for each of them?
Anyway, my goal is to get around the apparent insert limitations I am
running up against.
---------------------------------
Larry Hotchkiss