From: "John Smith" <[EMAIL PROTECTED]>
> On Thu, 2004-11-11 at 14:36, Philippe Poelvoorde wrote:
> > Hi,
> >
> > could you try adding a key with
> > ALTER TABLE properties ADD INDEX(countyid,old,price);
> > It could maybe help getting less rows at a time.
>
> I dropped `old` and `price` from the WHERE clause and the number of rows
> scanned was the same as without the LIMIT, which is good, so I am
> guessing the 3-column index will do the same.

`price` is still in the ORDER BY, so removing it only from the WHERE clause
will not really help.

- create an INDEX on the columns in the WHERE clause _and_ the ORDER BY /
GROUP BY, etc. This is the only way to ensure that all data is retrieved
through an index (see the first sketch below).
- the DESC direction will be slower than ASC (but you'll probably need it
anyway).
- the extra speed you could gain from the LIMIT is cancelled out by the
SQL_CALC_FOUND_ROWS option. This option makes the query execute as if the
LIMIT were not present, so that the number of rows in the entire result set
can be calculated (second sketch below).
- run OPTIMIZE TABLE regularly to help MySQL pick good execution paths; the
cardinality of the indexes is used to choose the execution path (third
sketch below).
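
First sketch, assuming the table and columns from the ALTER TABLE quoted
above (the index name and the literal values are just placeholders I made
up):

  -- one composite index covering both the WHERE columns and the ORDER BY column
  ALTER TABLE properties ADD INDEX idx_county_old_price (countyid, old, price);

  -- with both equality filters and the sort column in that index, EXPLAIN
  -- should show the index being used and no "Using filesort"
  EXPLAIN
  SELECT *
    FROM properties
   WHERE countyid = 42 AND old = 0
   ORDER BY price DESC
   LIMIT 0, 20;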
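
Second sketch, showing why SQL_CALC_FOUND_ROWS eats the LIMIT gain and one
possible way around it (again with placeholder values):

  -- SQL_CALC_FOUND_ROWS forces the whole result set to be computed even
  -- though only 20 rows are returned, so most of the LIMIT benefit is lost
  SELECT SQL_CALC_FOUND_ROWS *
    FROM properties
   WHERE countyid = 42 AND old = 0
   ORDER BY price DESC
   LIMIT 0, 20;
  SELECT FOUND_ROWS();

  -- if you can live without an exact total (or fetch it separately), the
  -- LIMIT stays cheap and the count can be served from the index
  SELECT COUNT(*) FROM properties WHERE countyid = 42 AND old = 0;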
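
Third sketch; the maintenance itself is one statement per table. Mentioning
ANALYZE TABLE is my own addition, as a lighter alternative:

  OPTIMIZE TABLE properties;   -- rebuilds the table and sorts the index pages
  ANALYZE TABLE properties;    -- only refreshes the key distribution statistics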

All sites where huge result sets are possible limit the set no matter what.
Sites like Google only _estimate_ the number of results.
You could also retrieve the ids of the desired records (with a maximum of,
say, 500 records) and store those ids in e.g. the session data. Paging
through the results then only requires you to retrieve the data of 10 or 20
records at a time.
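
A rough sketch of that id approach; the `id` primary key, the literal
values, and how you keep the list in the session are all assumptions on my
part:

  -- fetch at most 500 matching ids once and keep the list in session data
  SELECT id
    FROM properties
   WHERE countyid = 42 AND old = 0
   ORDER BY price DESC
   LIMIT 500;

  -- each page then only needs the rows for 10 or 20 of those ids
  SELECT *
    FROM properties
   WHERE id IN (1017, 1023, 1101, 1140, 1162)   -- ids for the current page
   ORDER BY price DESC;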

Hope this will help,

Regards, Jigal


