Hi,
No - adding a limit doesn't really help.
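
What I still need is a per-word frequency list from the fulltext index
itself, so I can see which words occur in more than 1000 records. My
current (untested) idea is to pull that straight out of the .MYI file
with the myisam_ftdump utility (I think older source trees ship it as
ft_dump under myisam/). The path and the index number below are guesses
for my setup - I'm assuming the data directory is /var/lib/mysql/<dbname>
and that the fulltext index is the second index on resources, i.e.
index number 1 counting from 0:

  cd /var/lib/mysql/<dbname>
  # -c prints per-word statistics (occurrence counts) from the index
  myisam_ftdump -c resources 1 > word_counts.txt
  # assuming the count is the first column, list the most common words
  sort -rn word_counts.txt | head -50

If that works, anything near the top of the list with a count over
1000 would be a stop-word candidate.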

thanks,
- Mark

On Thu, 28 Aug 2003 16:12:41 -0400, John Larsen wrote:
>Mark wrote:
>Why don't you just always put a limit 1000 on it? Do you ever need
>more than that?
>
>>Hi,
>>I have a fulltext index on a table with 80,000 rows. When I do a
>>search for a common word it is very slow; for example:
>>
>>select count(*) from resources
>>where match (title, keywords) against ('common word');
>>
>>might take over a minute if there are 5,000 or so rows that match.
>>I'm looking for a way to speed this up, and I'm thinking of adding
>>as stop words any word that occurs in more than 1000 records. Is
>>there a way to find these words? Or is there something else someone
>>can suggest?
>>
>>Here are some of my variables:
>>ft_boolean_syntax               | + -><()~*:""&
>>ft_min_word_len                 | 4
>>ft_max_word_len                 | 254
>>ft_max_word_len_for_sort        | 20
>>ft_stopword_file                | (built-in)
>>key_buffer_size                 | 268435456
>>myisam_sort_buffer_size         | 67108864
>>
>>Here are the table's files on disk:
>>-rw-rw----    1 mysql    mysql         8976 Aug 27 10:20 resources.frm
>>-rw-rw----    1 mysql    mysql    134471560 Aug 28 09:33 resources.MYD
>>-rw-rw----    1 mysql    mysql     61629440 Aug 28 10:23 resources.MYI
>>
>>Any tips are appreciated.
>>Thanks,
>>- Mark