I like using compound indexes, but they make the database VERY big
and slow to load.  The last 2 million entries took over 24 hours to
load with mysqlimport, normally a fast program.  The data file has
swollen from 580M to 2G, and the index file from 180M to over 3G.

I can't help but think that a 3G index file defeats
the purpose of speedy indexing; for one thing, I can't
keep it all in memory at once anyway!
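
For scale: even a generous key cache -- say

    set-variable = key_buffer=512M

in my.cnf (if I'm remembering the set-variable syntax right) --
covers barely a sixth of a 3G index file.  Checking the current
setting:

SHOW VARIABLES LIKE 'key_buffer%';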

I need the collective experience & brains of the group for this one.
My app is a phone book -- I need to search by name, city, province, etc.
But there are over 22 million entries.
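
A typical lookup is along these lines (the real queries are built by
the app, and the names here are invented):

SELECT phone_no, street_short, city
FROM newdb.White
WHERE last_name = 'Rossi' AND first_name = 'Maria' AND city = 'Milano';

which is the kind of query the compound indexes are for.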

The indexes I'm using look like this:

ALTER TABLE newdb.White ADD INDEX phone_no (phone_no);
ALTER TABLE newdb.White ADD INDEX lfc
    (last_name(12), first_name(8), city(12));
ALTER TABLE newdb.White ADD INDEX lcs
    (last_name(12), city(12), street_short(10));
ALTER TABLE newdb.White ADD INDEX lps
    (last_name(12), province, street_short(10));
ALTER TABLE newdb.White ADD INDEX loc (city, street_no(10));
ALTER TABLE newdb.White ADD INDEX postal_code (postal_code);
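
Related question: for the next bulk load I'm tempted to drop all six
indexes first, run mysqlimport against the bare table, and then
re-create them afterwards in a single ALTER (all six ADD INDEX
clauses above in one statement), so the table gets rebuilt once
instead of updating six indexes per inserted row.  The drop would be:

ALTER TABLE newdb.White
    DROP INDEX phone_no, DROP INDEX lfc, DROP INDEX lcs,
    DROP INDEX lps, DROP INDEX loc, DROP INDEX postal_code;

Has anyone measured whether that actually wins at this scale?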

Is there some guideline for the trade-off between index selectivity
and size here?  Should I, for example, use just 3 characters
of each field instead of 8 or 12?  These index files are just too huge.
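
To try to answer that myself, I was planning to measure how much
selectivity each prefix length actually buys with something like
this (a full scan of 22 million rows, so slow, but it only has to
run once):

SELECT COUNT(DISTINCT LEFT(last_name, 3))  / COUNT(*) AS sel_3,
       COUNT(DISTINCT LEFT(last_name, 8))  / COUNT(*) AS sel_8,
       COUNT(DISTINCT LEFT(last_name, 12)) / COUNT(*) AS sel_12
FROM newdb.White;

If sel_3 comes out nearly as high as sel_12, the extra 9 bytes per
key are pure fat.  Is that a sane way to pick the prefix lengths?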

Hints?

Steve

-- 
Steve Rapaport
World Citizen

