I second the question. Distributing the index could also reduce the
size of each individual fulltext index and the time taken to update
it. (A couple of rough sketches of the idea follow the quoted post
below.)

-steve

> On Thursday 07 February 2002 20:53, Brian  wrote:
> > Has anyone made a suggestion or thought about ways to distribute
> > databases which focus on fulltext indexes?
> >
> > Fulltext indexes do a good job on a moderate amount of data, but
> > once there is a lot of data to index, queries slow down
> > significantly.
> >
> > I have an example table with about 90 million rows and a fulltext
> > index on a varchar(100) field. A single-word query that returns
> > approx. 300k results takes an average of 15 seconds. A query with a
> > smaller result set (~10k rows) can be as quick as 1 second, which I
> > would consider acceptable.
> >
> > Has anyone thought about splitting the data into distributed files
> > or even machines? I.e., something as simple as splitting words by
> > first letter into a-h, i-p, q-z... or something more advanced?
> > (Maybe mysqld could automatically split based on the number of
> > results per unique word divided by the desired number of split
> > files/machines.) Would such a system give any advantage in search
> > speed and concurrent query scalability? I haven't looked at the
> > fulltext internals, so I don't know whether such "query routing"
> > could take place or not.
> >
> > If nothing else, does anyone else have experience with a table of this
> > size or even larger? What kind of tuning have you done?
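
For reference, the kind of statement Brian describes would look roughly
like this (the table and column names here are invented; only the shape
of the query matters):

    -- Hypothetical stand-in for the 90M-row table; fulltext
    -- indexes require a MyISAM table.
    CREATE TABLE docs (
        id    INT UNSIGNED NOT NULL PRIMARY KEY,
        title VARCHAR(100) NOT NULL,
        FULLTEXT (title)
    ) TYPE=MyISAM;

    -- A single-word natural-language search; with ~300k matching
    -- rows this is the sort of query that averages 15 seconds.
    SELECT id, title
      FROM docs
     WHERE MATCH (title) AGAINST ('somecommonword');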
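
And a rough sketch of the manual split, assuming the application does
the "query routing" itself (again, all names are invented; as far as I
know a MERGE table cannot carry a fulltext index, so the server won't
do this for you today):

    -- Same definition as docs above, one table per word range.
    CREATE TABLE docs_a_h (
        id    INT UNSIGNED NOT NULL PRIMARY KEY,
        title VARCHAR(100) NOT NULL,
        FULLTEXT (title)
    ) TYPE=MyISAM;
    -- ...likewise docs_i_p and docs_q_z...

    -- The application picks the table from the first letter of the
    -- search term, so each query hits only one (smaller) index:
    SELECT id, title
      FROM docs_a_h                     -- 'example' starts with 'e'
     WHERE MATCH (title) AGAINST ('example');

One caveat with splitting rows this way: a title containing words from
more than one range would have to be stored in every matching table
(or the split done on the index entries rather than on the rows), which
eats back some of the space savings.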
