Swaroop C H wrote:

On Tue, 25 Jan 2005 20:43:54 +0000, Daniel Bowett
<[EMAIL PROTECTED]> wrote:


I have just started playing around with MySQLdb for a project I am planning.

As a test I have written a script that executes 3000 insert statements
on a table. The table contains 10 fields with a mix of text and numbers
- it's a product table for a website, e.g. UPC, ProductName, Price, etc.

The problem I have is that it takes just over two minutes to execute the
3000 insert statements, which seems really slow! I am running it on a
machine with a 1.5 GHz Pentium M processor and a gig of RAM. I don't think
the machine is to blame for the speed because during execution the
processor sits at about 10% and there is loads of free RAM.
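A common cause of slow per-row inserts is issuing each statement as its own round trip and transaction; batching them through the DB-API's `executemany` with a single `commit` usually helps a lot. A minimal sketch of the pattern, shown here with the stdlib `sqlite3` module so it is self-contained (MySQLdb exposes the same `executemany`/`commit` calls, but uses `%s` placeholders instead of `?`; the table and rows below are hypothetical):

```python
import sqlite3

# In-memory database stands in for the MySQL connection; with MySQLdb the
# pattern is the same: conn = MySQLdb.connect(...), cur = conn.cursor().
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE product (upc TEXT, name TEXT, price REAL)")

# Hypothetical sample rows; the real script would collect all 3000 here.
rows = [("0001", "Widget", 9.99), ("0002", "Gadget", 19.99)]

# One executemany call + one commit instead of 3000 separate round trips.
cur.executemany(
    "INSERT INTO product (upc, name, price) VALUES (?, ?, ?)", rows
)
conn.commit()
```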



I think a better option is to have the rows in a text file and then use `mysqlimport` or the `LOAD DATA INFILE` SQL statement[1].

[1]: http://dev.mysql.com/doc/mysql/en/load-data.html
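By default, the text file that `LOAD DATA INFILE` (and `mysqlimport`, which wraps it) expects is tab-delimited with newline-terminated rows. A sketch of writing such a file from Python — the file name, column values, and database name below are hypothetical:

```python
import csv

# Hypothetical rows matching the product table (UPC, ProductName, Price).
rows = [("0001", "Widget", "9.99"), ("0002", "Gadget", "19.99")]

# LOAD DATA INFILE defaults: fields terminated by tab, lines by newline.
with open("product.txt", "w", newline="") as f:
    writer = csv.writer(f, delimiter="\t", lineterminator="\n")
    writer.writerows(rows)

# Then, in MySQL:
#   LOAD DATA INFILE '/path/to/product.txt' INTO TABLE product;
# or from the shell (mysqlimport derives the table name from the file name):
#   mysqlimport --local dbname product.txt
```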



I remember using that a while ago when I dabbled with MySQL, so I think that's a potential solution. With my current solution I get full error logging for each row I try to insert. Will mysqlimport fail completely if one row fails?
--
http://mail.python.org/mailman/listinfo/python-list
