Hi Guys,

I've got a script that I've been working on, written in Perl, which
interfaces to MySQL via the DBI module.
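
For reference, the connection is set up roughly like this (the database
name, host, and credentials below are placeholders, not the real ones):

use strict;
use warnings;
use DBI;

# Placeholder DSN and credentials, for illustration only.
my $dbh = DBI->connect(
    "DBI:mysql:database=mydb;host=localhost",
    "someuser", "somepass",
    { RaiseError => 1 },
) or die $DBI::errstr;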

The script only runs two statements in total (a select and an insert, shown
below), so I didn't think it would be too taxing on the machine.

Turns out that when the script is executed around 100 times a minute, the
load on the machine skyrockets (load average around 42.10). Obviously this
is not good, so my question is: where can I start with optimizing MySQL for
high usage (if you can even call it that)?

Basically, the first statement selects a column value from the row whose
username column matches a given string.

so for example: select data from mytable where username='theuser'
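
In DBI terms that's roughly the following ($dbh is the connection handle
from above; the placeholder keeps the username string out of the SQL):

# Look up the data column for one username.
my $sth = $dbh->prepare("select data from mytable where username = ?");
$sth->execute($username);
my ($data) = $sth->fetchrow_array;
$sth->finish;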

Then the second statement does an insert:

insert into myothertable values('blah','blah')
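
Roughly like this in DBI ($value1 and $value2 stand in for the real 'blah'
arguments):

# Dump one row into the big table; do() takes bind values after
# an attributes hashref (undef here).
$dbh->do(
    "insert into myothertable values (?, ?)",
    undef,
    $value1, $value2,
);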

"mytable" is small, no more than like 140 rows.
"myothertable" is large, and the table where all data gets dumped to.

As you can see, this is pretty basic. I've never done much with queries
being sent this fast (100+ a minute), so any advice is welcome.

Thanks.

