Hi David,

Thanks for your prompt reply. I'll try to answer your questions as best I can for now. Please see my replies below.
David Griffiths wrote:
So your application tracks incoming HTTP-GETs.
When you say that it's not able to "capture" all 1000 entries, what do you mean? Does an exception get thrown? Do some of the HTTP-GETs just not show in the database?
All I can verify at the moment is that all 1000 entries reached the server fine (determined by viewing the Apache logs). Whether some of the entries got an error back from MySQL I can't determine. Is there a MySQL log, similar to Apache's, where I can look at the incoming queries?
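(If I understand correctly, MySQL does have a general query log that records every statement the server receives, and an entry along these lines in my.cnf should enable it; the exact option name and log path below are only an example and may differ between versions:)

    [mysqld]
    # write every statement the server receives to this file
    # (general query log)
    log = /var/log/mysql/query.log

Comparing the number of INSERTs in that log against the Apache access log should show whether the missing entries ever reached MySQL at all.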
You need to provide a lot more information:
Do all the HTTP-GETs happen on the same connection?
If I understand you correctly, all the HTTP-GETs happen on different connections.
How long do the HTTP-GETs take to process? 10 seconds?
Each one takes less than a second. For testing, I've stripped the code down to just logging the entry into the database, without doing any other processing.
What hardware are you running on? CPU, disk, memory. Is the machine dedicated to MySQL?
Hmm... I can't remember the CPU speed and hard disk specs, but memory is 1 GB. Also, during the test the server (Linux 9) was not running anything else that might hog memory or CPU.
What's the MySQL CPU load on the above hardware during your test?
From what I observed, it was nowhere near 50%.
What table type (InnoDB, MyISAM, BDB, etc)?
I believe the tables are currently MyISAM.
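(To double-check the table type, I believe something like the following should work; mydb and request_log are just placeholders for my actual database and table names:)

    SHOW TABLE STATUS FROM mydb LIKE 'request_log';
    -- the Type (or Engine) column reports the handler: MyISAM, InnoDB, etc.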
What tuning have you done to the my.cnf, and are you sure that MySQL is using that my.cnf (ie is it in the correct location)?
The settings I have tweaked so far are (see the my.cnf snippet below):
1. join_buffer_size = 131072
2. key_buffer_size = 16773120
3. max_connections = 300
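(For reference, this is roughly how those settings sit under the [mysqld] section of my.cnf; the values are the ones listed above:)

    [mysqld]
    # buffer used for joins that cannot use an index
    join_buffer_size = 131072
    # MyISAM index cache, roughly 16 MB
    key_buffer_size = 16773120
    # maximum number of simultaneous client connections
    max_connections = 300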
Hope some of this info helps.
Thanks!
----- Original Message -----
From: "mysql" <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Thursday, January 15, 2004 9:23 AM
Subject: MySQL Performance Tuning?
Hi Gurus, I'm currently building an application which is expected to take very high loads. What the app does, in essence, is 'log' an incoming entry into MySQL, do something, then update the 'log' entry.
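(Roughly, the database work per request looks like this; the table and column names are just placeholders for illustration, with id assumed to be an AUTO_INCREMENT primary key:)

    -- record the incoming request as soon as it arrives
    INSERT INTO request_log (requested_at, url, status)
    VALUES (NOW(), '/some/path', 'received');

    -- ... do the actual processing here ...

    -- then mark that same entry as done
    UPDATE request_log SET status = 'processed'
    WHERE id = LAST_INSERT_ID();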
To test MySQL under high load, I used siege on another server to send 1000 HTTP GET requests to my PHP script, which then does as described above. The results I'm getting are not encouraging, as it seems MySQL is not capturing all 1000 entries.
I've tried some of the tuning suggestions from the net, but so far to no avail. Does anyone know what tuning is critical for MySQL to handle loads like this?
Thank you very much!
--
MySQL General Mailing List
For list archives: http://lists.mysql.com/mysql
To unsubscribe: http://lists.mysql.com/[EMAIL PROTECTED]