Dear all,

I'm presently trying to import the full Wikipedia dump for one of our research users. Unsurprisingly, it's a massive import file (2.7 TB).

Most of the data is being imported into a single MyISAM table that has an id column and a blob column. There are no constraints or indexes on this table. We're using an XFS filesystem.
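
For reference, the table is essentially just this (a simplified sketch; the table name and column types are illustrative, not the exact schema):

    CREATE TABLE wiki_dump (        -- illustrative name
      id   INT UNSIGNED NOT NULL,   -- plain id column, no primary key or index
      body BLOB                     -- the bulk of the data; exact blob type omitted
    ) ENGINE=MyISAM;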

The import starts off quickly but gets progressively slower as it runs: it began at about 60 GB per hour, but now that the MyISAM table is around 1 TB the load rate has dropped to roughly 5 GB per hour. At this rate the import will take weeks at best, if it finishes at all.

Can anyone suggest why this is happening and whether there's a way to improve performance? If there's a more suitable list for discussing this, please let me know.

Regards

Simon

--
Dr Simon Collins
Data Grid Consultant
National Grid Service
University of Manchester
Research Computing Services
Kilburn Building
Oxford Road
Manchester
M13 9PL

Tel 0161 275 0604

