At 12:34 PM 6/22/2004, you wrote:
We have an internal SNMP monitoring system that monitors about 10,000 devices. Each device is pinged and then polled for an average of 25-30 elements. The ping results and element values are written to text files, which another system picks up (over NFS) and inserts into a MyISAM (3.23.54) database. The data is kept for 13 weeks.
The database server is a 4-way Xeon with 12 GB of RAM and a striped RAID array dedicated to the database files and indexes.
Every 5 minutes another process goes through the most recent set of inserts and checks them for threshold breaches, so the entire last set of data is examined.
We're falling behind on the inserts because the system can't keep up with the insert volume, and the front end that generates web pages from the historical records is bogging down.
Have you tried "LOAD DATA INFILE"? It loads data from a text file directly into a table and is much faster than issuing individual "INSERT ..." statements. For example, I can load 1 million rows of x(30) into a MyISAM table in 15 seconds on a P4 2.4 GHz machine. You can use either IGNORE or REPLACE to handle duplicate keys.
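A minimal sketch of what that might look like here, assuming a hypothetical `poll_results` table and a tab-delimited file at `/data/results.txt` (the table name, file path, and column names are placeholders, not from the original setup):

```sql
-- Bulk-load a tab-delimited text file into the table in one pass.
-- REPLACE overwrites rows whose unique/primary key already exists;
-- use IGNORE instead to silently skip such rows.
LOAD DATA INFILE '/data/results.txt'
REPLACE INTO TABLE poll_results
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(device_id, element_name, poll_time, value);
```

Since the poller already writes its results to text files before the NFS hand-off, the loader process could run a statement like this directly instead of issuing one INSERT per element.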
Mike
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]