""blackwater dev"" <[EMAIL PROTECTED]> wrote in message news:[EMAIL PROTECTED]
I have a text file that contains 200k rows. These rows are to be imported into our database. The majority of them will already exist, while a few are new. Here are a few options I've tried:

I've had PHP cycle through the file row by row; if the row is already there, I delete it and do a straight insert, but that took a while.
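Roughly, that first approach amounts to something like the following sketch (the table, its columns, and the connection details are invented here, and the pg_* functions are assumed):

<?php
// Sketch only: assumes a pgsql connection, an invented "items" table with
// columns (id, name, price), and tab-separated rows in import.txt.
$conn = pg_connect("host=localhost dbname=mydb user=me");

$fh = fopen("import.txt", "r");
while (($line = fgets($fh)) !== false) {
    list($id, $name, $price) = explode("\t", rtrim($line, "\r\n"));

    // Delete any existing copy of the row, then insert the fresh one.
    pg_query_params($conn, "DELETE FROM items WHERE id = $1", array($id));
    pg_query_params($conn,
        "INSERT INTO items (id, name, price) VALUES ($1, $2, $3)",
        array($id, $name, $price));
}
fclose($fh);
?>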

Now I have PHP read each row from the text file and call array_combine() with a default array I keep in the class, so I end up with key/value pairs. I then run array_diff() against the data array I pulled from the database, which gives me the columns that are different, and I do an update on only those columns for that specific row. This is slow, and after about 180,000 rows PHP throws a memory error. I'm resetting all my vars to NULL at each iteration, so I'm not sure what's up.
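And a sketch of the second approach, with the same invented names (array_combine() and array_diff_assoc() are real PHP functions; array_diff_assoc() is used here so that keys, i.e. column names, are compared along with values; everything else is assumed):

<?php
// Sketch of the compare-and-update approach; column list, table name,
// and connection details are invented for illustration.
$columns = array('id', 'name', 'price');   // the "default array" of keys
$conn    = pg_connect("host=localhost dbname=mydb user=me");

$fh = fopen("import.txt", "r");
while (($line = fgets($fh)) !== false) {
    $values = explode("\t", rtrim($line, "\r\n"));
    $new    = array_combine($columns, $values);   // key => value pairs

    // Fetch the current row from the database.
    $res = pg_query_params($conn, "SELECT * FROM items WHERE id = $1",
                           array($new['id']));
    $old = pg_fetch_assoc($res);

    if ($old === false) {
        // Row is new: insert it instead (omitted here).
        continue;
    }

    // Columns whose values differ; array_diff_assoc() compares key => value pairs.
    $changed = array_diff_assoc($new, $old);
    unset($changed['id']);

    foreach ($changed as $col => $val) {
        pg_query_params($conn,
            "UPDATE items SET $col = $1 WHERE id = $2",
            array($val, $new['id']));
    }

    // Explicitly free the result for this row.
    pg_free_result($res);
}
fclose($fh);
?>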


Anyone have a better way to do this? In MySQL, I could simply do a REPLACE on each row... but not in Postgres.

Thanks!
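For reference, the MySQL REPLACE mentioned above looks like this (connection details and names invented); PostgreSQL at the time had no single-statement equivalent:

<?php
// MySQL's REPLACE deletes any existing row with the same primary key and
// inserts the new one in a single statement. Names here are invented.
$link = mysql_connect("localhost", "user", "pass");
mysql_select_db("mydb", $link);
mysql_query("REPLACE INTO items (id, name, price) VALUES (42, 'widget', 9.95)", $link);
?>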


Couldn't you just use PHP to rewrite the 200k-row text file from plain text into actual insert commands? Once you have that, most MySQL interfaces have an import facility where you can give it a file containing a list of commands (insert, update, etc.) and it will run them itself. That should be much faster.

- Dan
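A minimal sketch of what Dan describes, assuming tab-separated input and invented table/column names; the generated file can then be run in one shot with the command-line client (e.g. psql mydb -f import.sql):

<?php
// Sketch: rewrite the text file into a file of SQL statements, then let
// the database client run the whole file. Names and format are invented.
$in  = fopen("import.txt", "r");
$out = fopen("import.sql", "w");

while (($line = fgets($in)) !== false) {
    list($id, $name, $price) = explode("\t", rtrim($line, "\r\n"));
    $name = addslashes($name);   // naive escaping, for illustration only

    // One DELETE + INSERT pair per row, mirroring the "replace" idea.
    fwrite($out, "DELETE FROM items WHERE id = $id;\n");
    fwrite($out, "INSERT INTO items (id, name, price) VALUES ($id, '$name', $price);\n");
}

fclose($in);
fclose($out);
// Then run it in one shot, e.g.:  psql mydb -f import.sql
?>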