Hey all, I have yet to run an experiment to test this but was wondering if
anyone knows the answer ahead of time.
If I have an index built with documents from before implementing the
CommonGrams filter, then enable it and start adding documents that have the
filter/tokenizer applied, will a search that fits the criteria, for example:
"to be or not to be"
still return results from the earlier documents as well as the new ones?
The idea is that a full re-index is going to be difficult, so I would
rather do it over time by replacing large numbers of documents
incrementally.  Thanks,
Dave
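
P.S. For reference, here is roughly what I mean by "enabling" it, sketched
with the stock Solr factories (the field type name and the word-list file
name are just placeholders from my setup):

```xml
<!-- Index-time analyzer adds the common grams; query-time analyzer
     uses the query variant. Assumes commonwords.txt lists the common
     terms (e.g. "to", "be", "or", "not"). -->
<fieldType name="text_commongrams" class="solr.TextField">
  <analyzer type="index">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.CommonGramsFilterFactory"
            words="commonwords.txt" ignoreCase="true"/>
  </analyzer>
  <analyzer type="query">
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <filter class="solr.CommonGramsQueryFilterFactory"
            words="commonwords.txt" ignoreCase="true"/>
  </analyzer>
</fieldType>
```

My worry is exactly that the query-side filter will produce gram tokens that
only exist in the newly indexed documents.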