Splitting your data into 50 tables based on state is, I think, a terrible idea. What if you are looking for a name? You would then have to search all 50 tables.
30 seconds to search ~5 million records is a very long time. With a proper index, it should be almost immediate. Please email me the result of this command:
show index from usa_phone_list
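If that output shows no index covering the state column, adding one is straightforward. A minimal sketch, assuming the column is actually named `state` (adjust to your schema):

```sql
-- Hypothetical index name; this is what lets MySQL jump straight
-- to the NJ rows instead of scanning all ~5 million records.
ALTER TABLE usa_phone_list ADD INDEX idx_state (state);
```

Building the index will take a while on a table that size, but it only has to be done once.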
I'd also be curious how long this command takes:
select state,count(*) from usa_phone_list group by state
If my thinking is correct, that query should take considerably longer than the one you ran. If it takes about the same amount of time, then you don't have a proper index. The query above needs to read the entire table to compile the stats, while your query should only need to read the NJ records if it is really using the index correctly.
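You can also ask MySQL directly whether it plans to use the index. A quick sketch, assuming your original query filtered on a `state` column:

```sql
-- The `key` column in the output shows which index (if any)
-- MySQL will use; `rows` estimates how many records it must read.
EXPLAIN SELECT * FROM usa_phone_list WHERE state = 'NJ';
```

If `key` comes back NULL and `rows` is close to 5 million, MySQL is doing a full table scan and the index is either missing or not being used.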
Also, you could try running this command:
analyze table usa_phone_list
Your table will be locked during this process, and I don't know how long it will take. But this tells MySQL to analyze the data distribution in the table so it can better optimize your queries.
On Thursday, October 17, 2002, at 10:08 PM, James Taylor wrote:
Currently, I'm using 1 large table to store basically everything. I'm
trying to figure out whether I should use some sort of normalization, and
also I was thinking of basically creating 50 different tables, one for each
state...
--
Brent Baisley
Systems Architect
Landover Associates, Inc.
Search & Advisory Services for Advanced Technology Environments
p: 212.759.6400/800.759.0577