> Concurrent updates and MySQL aren't the best of friends, yet. (Perhaps DBD
> and/or that new Gemini stuff can fix some of your problems? Although
> Gemini will probably cost you...)
> 
> A lot of the slow queries can be solved by making better tables, sometimes
> even de-normalizing them if that can prevent a join, by moving the numeric
> data into separate tables (variable-length columns kill performance), etc.
> Standard procedure :)
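(As a rough sketch of the suggestion above: splitting the variable-length
text out of a heavily-updated table keeps that table's rows fixed-width, so
MyISAM can use its faster static row format. Table and column names here are
purely illustrative.)

    -- Illustrative only: keep the hot, frequently-updated table
    -- fixed-width (no VARCHAR/TEXT/BLOB columns) ...
    CREATE TABLE candidate_stats (
        id        INT UNSIGNED NOT NULL PRIMARY KEY,
        logins    INT UNSIGNED NOT NULL DEFAULT 0,
        last_seen DATETIME
    );

    -- ... and park the variable-length data in a side table that
    -- is joined in only when it is actually needed.
    CREATE TABLE candidate_text (
        id     INT UNSIGNED NOT NULL PRIMARY KEY,
        resume TEXT
    );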

We know. We've been through this. The problem with MySQL is that
concurrent updates can't use row locking, only table locking. That means
the whole table stays locked until the updates are finished, which
isn't good. Row locking based on the primary key would be A LOT faster.
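To make the difference concrete (a minimal sketch; the table and column
names are made up):

    -- With table locking, an UPDATE write-locks the whole table, so
    -- two sessions updating *different* rows still run one at a time.
    --   session 1:
    UPDATE candidates SET last_login = NOW() WHERE id = 42;
    --   session 2 (blocks until session 1's statement finishes):
    UPDATE candidates SET last_login = NOW() WHERE id = 99;
    -- With row-level locking, only the rows matched by the primary
    -- key would be locked, and the two statements could run
    -- concurrently.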

> But in the end, if you have tons of concurrent updates then you might want
> to consider alternatives to MySQL.
> (postgres, frontbase, etc.)

Indeed. We might give it a try someday, although they don't have LIMIT
(which gives us a nice "paging" system). We use LIMIT everywhere here;
it's very handy when you're doing searches all the time.
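For example, a search results page might look something like this (MySQL's
LIMIT offset,count form; table and column names are illustrative):

    -- Page 3 of the search results, 20 rows per page:
    SELECT id, name
    FROM candidates
    WHERE city = 'Sao Paulo'
    ORDER BY name
    LIMIT 40, 20;   -- skip the first 40 rows, return the next 20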



-- 
Leonardo Dias
Catho Online
Web Developer
http://www.catho.com.br/

