Getting the slow query data into the database was a breeze with mk-query-digest, but now does anyone happen to know of scripts out there that will generate an HTML page to view the output? This is probably a better question for the maatkit mailing list, but I figure someone here might have a link.
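Something as crude as the wrapper below would probably do in the meantime (just a rough sketch; the log path and the report location are placeholders, and it assumes mk-query-digest is already on the PATH), but I was hoping for something that produces a nicer, sortable report:

  #!/bin/sh
  # Rough sketch: wrap mk-query-digest's plain-text report in a minimal HTML page.
  # The paths below are placeholders -- adjust for your setup.
  SLOW_LOG=/var/log/mysql/mysql-slow.log
  REPORT=/var/www/html/slow-report.html

  {
    echo '<html><head><title>Slow query report</title></head><body><pre>'
    # HTML-escape the report so query text containing & < > renders correctly
    mk-query-digest "$SLOW_LOG" \
      | sed -e 's/&/\&amp;/g' -e 's/</\&lt;/g' -e 's/>/\&gt;/g'
    echo '</pre></body></html>'
  } > "$REPORT"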
Thanks,

Milan

On Sun, Aug 2, 2009 at 11:16 PM, Milan Andric <mand...@gmail.com> wrote:
> Nuno, thanks for the tips. I think I will work on getting
> mk-query-digest to log to a db table and run it periodically. Sounds
> like a very useful thing to have.
>
> --
> Milan
>
> On Sun, Aug 2, 2009 at 10:02 AM, <nuno.tava...@dri.pt> wrote:
>> Hi Milan,
>>
>> I can see many ways of accomplishing what you want:
>> * I'm almost sure mk-query-digest will allow you to do so;
>> * Crop the slow query log for the desired timespan (that's a couple of
>> shell scripting commands) and run mk-query-digest against it;
>> * Set the slow query log to a filename that is a link to /dev/null, with one
>> cron job relinking it to a real file at noon and another relinking it back to
>> /dev/null at 1pm - and then run whatever scripts you want against the real file;
>> * In newer versions you can log the slow queries to tables for analysis;
>> * Ultimately, you can also try a patched version of mysqldumpslow I was using
>> for some time, explained here:
>> http://gpshumano.blogs.dri.pt/2009/07/04/analysing-mysql-slow-queries/
>>
>> With the last two options you'll be logging to a table, so it will be easy to
>> select a timed range of queries for consideration.
>>
>> Hope that helps,
>> -NT
>>
>> Quoting Milan Andric <mand...@gmail.com>:
>>
>>> Thanks for the quick replies, guys. I won't be pulling queries out of
>>> Drupal anytime soon, and the optimizations I will do are minimal. Sounds
>>> like we might just have to live with mediocre performance for now.
>>>
>>> I will definitely look further at maatkit, though. I ran
>>> mk-query-digest on my slow log and its output is pretty amazing. There's
>>> a lot there, and it might take me a while to digest it all. It does
>>> report a time range for the queries it spots, like:
>>>
>>> # Time range 2009-07-31 06:46:48 to 2009-07-31 12:51:21
>>>
>>> which is pretty helpful, but ideally I would like to pass arguments to
>>> mk-query-digest to list the specific queries that were slow between noon
>>> and 1pm, for example. If anyone happens to know how to do that, it would
>>> be helpful. Otherwise I will keep grokking the maatkit tools.
>>>
>>> Thanks,
>>>
>>> Milan
>>>
>>>
>>> On Fri, Jul 31, 2009 at 3:14 PM, Johan De Meersman <vegiv...@tuxera.be> wrote:
>>> >> On Fri, Jul 31, 2009 at 8:14 PM, Milan Andric <mand...@gmail.com> wrote:
>>> >> Hello,
>>> >>
>>> >> I'm serving a burly Drupal install, and at some points throughout the
>>> >> day the mysql threads go way up and iowait peaks. I'm not sure which
>>> >> is causing which, but during this time the server is unresponsive. I
>>> >> would like to determine whether a poorly optimized query is causing
>>> >> this.
>>> >
>>> > Drupal is a disaster :-)
>>> >
>>> > I'm running a dozen Drupals, and we've spent the last eight months heavily
>>> > optimizing every aspect. It's reasonably good now, but I'd still like to
>>> > move away from it.
>>> >
>>> > Drupal wants to remain compatible with multiple databases, notably both
>>> > MySQL and Postgres, and for v7 also Oracle. This means that they can't
>>> > optimize their DB layer and use the fancy tricks of a single DB, but must
>>> > do things in ways that are compatible with all of them. Postgres doesn't
>>> > have autoincrement? No autoincrement. MySQL doesn't have sequences? No
>>> > sequences.
>>> >
>>> > Some points:
>>> > - Drupal uses its own internal unique ID system, not autoincrement, for a
>>> > number of things. This means that every insert that uses it locks all
>>> > other inserts that use it.
>>> > - Drupal has the bothersome tendency to take full table locks around some
>>> > inserts. This means that even if you move to InnoDB, you'll still get full
>>> > table locks. We patched this out according to info found on drupal.org and
>>> > other sites.
>>> > - Drupal's innate caching mechanisms are DB-bound. Fine and dandy, but
>>> > rather irritating when, as soon as a busy page expires, a hundred threads
>>> > all try to update the same table at the same time. We implemented
>>> > Memcached. Patches, at least partial ones, are on drupal.org, but I know
>>> > we did more and better in-house.
>>> > - Drupal can cache fine for anonymous users, but once you're logged in,
>>> > it's nearly impossible to cache. We worked around this by basically making
>>> > 'static' cacheable pages and pulling any dynamic content in through AJAX.
>>> > - Get rid of Drupal's views (in Drupal, that is, not what is called a view
>>> > in a database). They generate horribly inefficient queries.
>>> > - Full-text search modules use MySQL's full-text indexes, but that's not
>>> > the best way of doing things. We implemented Sphinx search; there's an
>>> > existing Drupal module for the integration.
>>> >
>>> > There's lots more that was done, but I can't provide all that info because
>>> > a) it's company-internal, and b) I'm not a developer and thus don't know
>>> > half of it :-)
>>> >
>>> > Suffice it to say, I don't like Drupal for high-traffic interactive sites.
>>> > Get away from it if you can.
>>> >
>>> >> I'm logging slow queries, but is there a way to see when the
>>> >> slow queries take place as well? I'd like to know what queries are being
>>> >> processed during this window of poor response time, usually around
>>> >> noon local time.
>>> >
>>> > If you want more than just the slow queries, you'll have to enable the
>>> > full log. Be aware that this has a noticeable impact on performance and
>>> > will spam your disks. Log on different spindles if you can, and monitor
>>> > your disk usage carefully - mysql stops functioning if it can't write its
>>> > logs.
>>> >
>>> > --
>>> > Celsius is based on water temperature.
>>> > Fahrenheit is based on alcohol temperature.
>>> > Ergo, Fahrenheit is better than Celsius. QED.
>>>
>>
>> --
>> Nuno Tavares
>> +351 93 618 40 86
>> dri Consultoria Informatica
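Coming back to the noon-to-1pm question above: one way to do what Nuno suggests ("crop the slow query log for the desired timespan") is a small awk filter keyed on the "# Time:" header lines, then running mk-query-digest over the cropped file. A rough sketch, assuming the classic slow-log format where headers look like "# Time: 090731 12:34:56" (entries logged in the same second omit the header and inherit the previous one):

  #!/bin/sh
  # Rough sketch: extract slow-log entries logged between 12:00 and 13:00
  # and digest only those. Assumes "# Time: YYMMDD HH:MM:SS" headers;
  # entries without their own "# Time:" line inherit the previous timestamp.
  SLOW_LOG=/var/log/mysql/mysql-slow.log

  awk '
    /^# Time: / { ts = $4; inrange = (ts >= "12:00:00" && ts < "13:00:00") }
    inrange
  ' "$SLOW_LOG" > /tmp/slow-noon.log

  mk-query-digest /tmp/slow-noon.log

Newer mk-query-digest releases may support time-range options for this directly, but I haven't verified which version added them, so the crop-then-digest route is the safe bet.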
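And for the "log the slow queries to tables" option Nuno mentions, selecting the noon-to-1pm window becomes a plain query against mysql.slow_log. A minimal sketch, assuming MySQL 5.1+ where these variables are dynamic (the date is obviously a placeholder):

  #!/bin/sh
  # Rough sketch, assuming MySQL 5.1+ where slow_query_log and log_output
  # can be changed at runtime. Sends slow queries to the mysql.slow_log table.
  mysql -u root -p -e "SET GLOBAL slow_query_log = 1;
                       SET GLOBAL log_output = 'TABLE';"

  # Later: pull only the queries logged between noon and 1pm on a given day.
  mysql -u root -p -e "SELECT start_time, query_time, rows_examined, db, sql_text
                       FROM mysql.slow_log
                       WHERE start_time BETWEEN '2009-07-31 12:00:00'
                                            AND '2009-07-31 13:00:00'
                       ORDER BY query_time DESC;"

Note that the slow_log table uses the CSV engine by default, so it's fine for ad-hoc inspection like this, but you'll want to truncate or rotate it rather than let it grow forever.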