Chris,

I would consider loading this script only once, having it establish a single connection to the DB server, and then making it loop with a time delay while it waits for new data to insert into the DB - that's pretty much all the script does, right?

The thing is, if you invoke the script every time you have a new portion of data to upload, you first have to start Perl, then have it load the script, then establish a new connection to the DB server. All of these steps are redundant, and they are what is overloading your box.
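Something along these lines is what I mean. This is just a rough sketch - I haven't seen your script, so I'm guessing that new data arrives as tab-separated lines on STDIN (say, from a FIFO that stays open), and the connection details and values are made up:

#!/usr/bin/perl
# Rough sketch of the "load once, connect once, loop" idea.
# Assumptions: data arrives as "username<TAB>payload" lines on STDIN;
# DSN, credentials, and column layout are placeholders.
use strict;
use warnings;
use DBI;

# One connection for the lifetime of the process, not one per portion of data.
my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                       'dbuser', 'dbpass',
                       { RaiseError => 1, AutoCommit => 1 })
    or die "Cannot connect: $DBI::errstr";

# Prepare both statements once and reuse them on every iteration.
my $sel = $dbh->prepare('SELECT data FROM mytable WHERE username = ?');
my $ins = $dbh->prepare('INSERT INTO myothertable VALUES (?, ?)');

while (1) {
    my $line = <STDIN>;
    if (!defined $line) {    # nothing to do yet: wait a bit and try again
        sleep 1;
        next;
    }
    chomp $line;
    my ($username, $payload) = split /\t/, $line, 2;

    $sel->execute($username);
    my ($data) = $sel->fetchrow_array;
    $sel->finish;

    $ins->execute($data, $payload);
}

The point is that DBI->connect and prepare() happen only once, so each new portion of data costs nothing more than an execute().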

Hope this helps.
Vadim.

Chris wrote:

Hi Guys,

I've got a script that I've been working on, written in Perl, which interfaces to
MySQL via the DBI module.

The script only runs two queries in total, so I didn't think it would be too
taxing on the machine.

It turns out that when the script is executed around 100 times a minute, the
load on the machine skyrockets (load average around 42.10). Obviously this
is not good, so my question is: where can I start with optimizing MySQL for
high usage (if you can even call it that)?

Basically, the first statement selects a column value from the row whose
username column matches a given username string.

so for example: select data from mytable where username='theuser'

then the second statement does:

insert into myothertable values('blah','blah')

"mytable" is small, no more than like 140 rows.
"myothertable" is large, and the table where all data gets dumped to.

As you can see, this is pretty basic. I've never done much with queries being
sent this fast (100+ a minute), so any advice is welcome.

Thanks.
