On Tue, Jun 22, 2004 at 01:34:39PM -0400, Aram Mirzadeh wrote:
> 
> We have an internal SNMP monitoring system that is monitoring about
> 10,000 devices.  Each device is pinged and then polled for an
> average of 25-30 elements.  The ping results and elements are then
> stored in a text file, then another system picks them up (over NFS)
> and inserts them into a MyISAM (3.23.54) database.  The data is
> kept for 13 weeks.
> 
> The database system is a Xeon 4 way, 12GB of ram with a striped raid
> array dedicated to the database files and its indexes and such.
> 
> Every 5 minutes another process goes through the last set of inserts
> and compares them for any threshold breaches, so the entire last set
> of data is looked at.
> 
> We're falling behind on the inserts because the system can't seem
> to keep up with the insert volume, and the front end that generates
> the web pages from the previous records is bogging down.
> 
> I have read the usual optimization papers and have done as much as
> I felt safe doing.  Are there any papers on optimizing very large
> databases?  Anything else I should be looking at?

I'd consider bulking up the INSERTs, performing multi-row INSERTs
rather than doing them one by one.  That can speed things up quite a
bit in my experience.
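
As a rough sketch (the table and column names here are made up for
illustration; substitute your own schema), instead of one statement per
element:

    INSERT INTO samples (device_id, element, value, ts)
        VALUES (42, 'ifInOctets', 123456, NOW());
    INSERT INTO samples (device_id, element, value, ts)
        VALUES (42, 'ifOutOctets', 654321, NOW());

you can send all 25-30 elements for a device in one statement:

    INSERT INTO samples (device_id, element, value, ts) VALUES
        (42, 'ifInOctets',  123456, NOW()),
        (42, 'ifOutOctets', 654321, NOW()),
        (42, 'ifInErrors',       0, NOW());

Each statement pays parsing, locking, and network round-trip overhead
only once, so batching per device (or per file) usually cuts the insert
time substantially.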

Jeremy
-- 
Jeremy D. Zawodny     |  Perl, Web, MySQL, Linux Magazine, Yahoo!
<[EMAIL PROTECTED]>  |  http://jeremy.zawodny.com/

[book] High Performance MySQL -- http://highperformancemysql.com/

-- 
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe:    http://lists.mysql.com/[EMAIL PROTECTED]