All,
I realize that we should be consuming all tokens from a stream. I'd like to
wrap a client's Analyzer with LimitTokenCountAnalyzer with
consumeAllTokens=false. So far this has caused no problems with the analyzers
I've used, but when I use MockTokenizer I hit this assertion error: end() called
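For context, here is a minimal sketch of the wrapping I mean, assuming Lucene 4.5.x (lucene-core and lucene-analyzers-common on the classpath); the field name, text, and limit of 2 are placeholders:

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.miscellaneous.LimitTokenCountAnalyzer;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.util.Version;

public class LimitDemo {

    // Consume a TokenStream following the documented workflow:
    // reset(), incrementToken() until false, end(), close().
    public static List<String> analyze(Analyzer analyzer, String field, String text)
            throws IOException {
        List<String> tokens = new ArrayList<String>();
        try (TokenStream ts = analyzer.tokenStream(field, text)) {
            CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
            ts.reset();
            while (ts.incrementToken()) {
                tokens.add(term.toString());
            }
            ts.end();
        }
        return tokens;
    }

    public static void main(String[] args) throws IOException {
        Analyzer base = new StandardAnalyzer(Version.LUCENE_45);
        // consumeAllTokens=false: stop pulling from the wrapped stream
        // once the limit is reached, rather than draining it first.
        Analyzer limited = new LimitTokenCountAnalyzer(base, 2, false);
        System.out.println(analyze(limited, "f", "one two three four"));
    }
}
```

With consumeAllTokens=false the wrapped stream is abandoned mid-consumption once the limit is hit, which is exactly the state MockTokenizer's assertions are designed to flag.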
On Fri, Nov 1, 2013 at 9:30 AM, Allison, Timothy B. talli...@mitre.org wrote:
Disabling assertions gives me pause, as does disobeying the workflow
(http://lucene.apache.org/core/4_5_1/core/index.html). I assume from the
warnings that there are Analyzers and use cases that will fail unless the