I have an index with a few hundred thousand records. The index is generally very fast, with sub-100ms responses. However, if I start adding records, queries get extremely slow, up to over 2 seconds each. The slowdown persists even when I am no longer actively indexing, until I optimize the index.
To work around this, I index in bulk and immediately optimize. This is not ideal for the performance of my site. Unfortunately, contrary to what Dave Balmain seems to say here: http://osdir.com/ml/lang.ruby.ferret.general/2006-08/msg00037.html , the index seems to be locked for reading during optimization.

So I have two questions:

1) Why does performance degrade so badly after adding just a few records, unless I optimize the index? Can I avoid this?

2) Can I keep a second index so that reads don't get locked during optimization, and then switch over to the optimized index? Or perhaps the index is not really locked, and optimization is simply using all the CPU? (I am on a single-CPU server.)

Thanks for any help.

-Alex
_______________________________________________
Ferret-talk mailing list
[email protected]
http://rubyforge.org/mailman/listinfo/ferret-talk
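For what it's worth, the "second index" idea in question 2 can be sketched as a swap pattern: serve reads from a live index while a fresh copy is built and optimized off to the side, then swap the reference atomically. The sketch below is a minimal, hypothetical illustration in plain Ruby; `FakeIndex` and `SwappableIndex` are stand-ins invented here, not part of Ferret's API. With Ferret you would hold two `Ferret::Index::Index` objects on separate directories and swap between them the same way.

```ruby
require 'monitor'

# Hypothetical stand-in for a search index so the pattern is runnable here.
# With Ferret, a real Ferret::Index::Index on its own directory would play
# this role instead.
class FakeIndex
  attr_reader :docs

  def initialize(docs = [])
    @docs = docs.dup
  end

  def optimize!
    # Stands in for the slow segment-merge step.
  end

  def search(term)
    @docs.grep(/#{Regexp.escape(term)}/)
  end
end

# Serves reads from @live while a replacement is built and optimized in
# the background, then swaps the optimized copy in atomically, so readers
# never wait on the optimize itself.
class SwappableIndex
  include MonitorMixin

  def initialize(index)
    super()
    @live = index
  end

  def search(term)
    synchronize { @live }.search(term)
  end

  def reindex_and_swap(new_docs)
    fresh = FakeIndex.new(@live.docs + new_docs)
    fresh.optimize!                 # slow step runs off the live index
    synchronize { @live = fresh }   # readers only block for this swap
  end
end
```

A short usage example of the sketch:

```ruby
idx = SwappableIndex.new(FakeIndex.new(%w[apple banana]))
idx.reindex_and_swap(%w[apricot])
idx.search("ap")  # => ["apple", "apricot"]
```

Rebuilding the whole index per swap is of course wasteful at a few hundred thousand records; in practice you would add only the new documents to the standby copy before optimizing and swapping.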

