Hi there,

I used to work with a table containing over 2.5 million cities. This
was very slow even with indexes, so I split the huge table into
250 tables named after their country. Example: gm_cities for the
German city table.
Performance is OK now, because my interface knows which table to
look in.
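
For a German city, say, the interface builds the table name from the
country code and queries that table directly, roughly like this (the
province code 'by' is just a made-up example):

    SELECT ID, city
    FROM gm_cities
    WHERE province_id = 'by';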

A new problem arose: there is a need to search across all tables to
find a specific city. Finding a city in even one table takes 0.5s,
which is a lot once you multiply it by 250 tables.
I have to admit that there is no index on the city column yet. I tried
to put a fulltext index on the city column, but this blew the table
size up from 2.5MB to 7.5MB and made the query take 1s.
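
For reference, the fulltext attempt was along these lines (the index
name is just what I called it):

    -- add the fulltext index (this is what tripled the table size):
    ALTER TABLE gm_cities ADD FULLTEXT idx_city_ft (city);

    -- then search with it:
    SELECT ID, city
    FROM gm_cities
    WHERE MATCH(city) AGAINST('Berlin');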

My questions are as follows:

1. What is wrong with this one table being 7.5MB instead of 2.5MB? I
deleted the index and optimized the table, but it is still 7.5MB.

2. What would be the best way to organize this data so that a search
over these 2.5 million entries takes no longer than 0.5s? I could put
it all back into one table, but that would mean a week of work because
my application is built on this system. Maybe a good hint for the
right index would be the best help for me.
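
For example, would a plain index on the city column be enough,
something like this?

    -- index the full column:
    ALTER TABLE gm_cities ADD INDEX idx_city (city);

    -- or, to keep the index small, only a prefix
    -- (20 chars is just a guess):
    ALTER TABLE gm_cities ADD INDEX idx_city_prefix (city(20));

As far as I understand, such an index would only help exact matches
and searches like city LIKE 'Ber%', not city LIKE '%burg'.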

My table structure looks like this:

ID           -> mediumint(9), unique index
city         -> char(100)
province_id  -> char(2), index
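
Written out as a CREATE statement, each country table looks roughly
like this (MyISAM assumed, since fulltext needs it):

    CREATE TABLE gm_cities (
        ID          MEDIUMINT(9) NOT NULL,
        city        CHAR(100),
        province_id CHAR(2),
        UNIQUE KEY (ID),
        KEY (province_id)
    ) TYPE=MyISAM;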

Another issue: the search might take even longer, because I also have
to get the province name and the country name out of the countries and
provinces tables.
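
That lookup would be something along these lines (the column names in
countries and provinces are only my guesses here):

    SELECT ci.city, p.name AS province, co.name AS country
    FROM gm_cities ci
         JOIN provinces p ON p.id = ci.province_id
         JOIN countries co ON co.id = p.country_id
    WHERE ci.city = 'Berlin';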

Thank you for any help,

Andy



