Hi all, MySQL team,
I have a very complicated problem; I hope I can explain it clearly.
  I have a report file (.txt) with about 200,000 records describing books 
(title, classification, year, publisher, authors, subjects, etc.) with no 
tabs or separators. I built a parser to split each field, and it works fine.
  After parsing each record (one book) I insert that information into the 
database, across five tables (checking for duplicates, resolving ids, etc.):
                        book
                        author
                        subject
                        book-author (relationship)
                        book-subject (relationship)
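For a load of this size, inserting one row at a time is usually the bottleneck; batching many rows into a single multi-row INSERT tends to help a lot. A minimal sketch in Java of building such a statement (the table and column names are hypothetical placeholders, and values are assumed to be already escaped/quoted by the caller):

```java
import java.util.List;

public class BatchInsertBuilder {
    // Build one multi-row INSERT statement, e.g. for the book table.
    // Column names here are hypothetical; values must be pre-escaped.
    static String buildInsert(String table, String columns, List<String> rowTuples) {
        StringBuilder sql = new StringBuilder();
        sql.append("INSERT INTO ").append(table)
           .append(" (").append(columns).append(") VALUES ");
        for (int i = 0; i < rowTuples.size(); i++) {
            if (i > 0) sql.append(",");
            sql.append("(").append(rowTuples.get(i)).append(")");
        }
        return sql.toString();
    }

    public static void main(String[] args) {
        List<String> rows = List.of("'Book A',1999,'Pub X'",
                                    "'Book B',2000,'Pub Y'");
        // Sending a few hundred rows per statement instead of one
        // greatly reduces round trips and per-statement overhead.
        System.out.println(buildInsert("book", "title,year,publisher", rows));
    }
}
```

Sending batches of a few hundred rows per statement (and wrapping several batches in one transaction) cuts per-row overhead substantially compared to one INSERT per book.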

  When I reach approximately book number 11,500, MySQL shows a steep drop in 
performance; each insertion takes 30 seconds or more, with or without 
indexes.
  At first I thought the parsing was what took longer, so I parsed the entire 
file and generated a Java serialized file, then just inserted from that into 
the database, but with no success; performance was the same.

        I have calculated the total number of rows for each table: about 
200,000 books, 150,000 authors, 130,000 subjects, and up to 250,000 in each 
of the relationship tables.
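At these row counts, another common approach is to skip row-by-row INSERT entirely: have the parser write one tab-delimited file per table, then load each with MySQL's LOAD DATA INFILE, which is typically much faster than individual inserts. A sketch of the writer side (field names and values are hypothetical):

```java
public class TsvExport {
    // Join already-parsed field values into one tab-delimited line,
    // suitable for a file loaded with:
    //   LOAD DATA INFILE 'book.txt' INTO TABLE book;
    // (default field terminator is a tab, default line terminator '\n')
    static String toTsvLine(String... fields) {
        return String.join("\t", fields);
    }

    public static void main(String[] args) {
        // Hypothetical record: title, year, publisher.
        System.out.println(toTsvLine("Book A", "1999", "Pub X"));
    }
}
```

Duplicate checking and id resolution would then happen in memory while writing the files (e.g. keeping a map of already-seen authors), rather than with a SELECT per row against the database.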

  Does anyone have an idea of how I can load this information quickly?

  I'll really appreciate any advice.

Carlos


