Hi,
thanks for your reply. In several other implementations I've seen the pattern
of using a while (input.incrementToken()) loop within the filter's
incrementToken method. Is this approach recommended, or are there hidden traps
(e.g. memory consumption, dependency on filter ordering, and so on)?
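For reference, the control flow I mean can be sketched without Lucene on the
classpath. Everything below is an illustrative stand-in, not Lucene API:
SimpleStream mirrors the TokenStream#incrementToken contract, and
MinLengthFilter is a hypothetical filter that drops short tokens, looping on
the wrapped stream until a token survives:

```java
import java.util.Iterator;
import java.util.List;

// Stand-in for TokenStream: advance to the next token, expose its term.
interface SimpleStream {
    boolean incrementToken(); // false once the stream is exhausted
    String term();            // stands in for CharTermAttribute
}

// A trivial producer backed by a fixed list of words.
final class ListStream implements SimpleStream {
    private final Iterator<String> it;
    private String current;
    ListStream(List<String> tokens) { this.it = tokens.iterator(); }
    public boolean incrementToken() {
        if (!it.hasNext()) return false;
        current = it.next();
        return true;
    }
    public String term() { return current; }
}

// The pattern in question: loop on input.incrementToken() inside the
// filter's own incrementToken, skipping tokens that fail a predicate.
final class MinLengthFilter implements SimpleStream {
    private final SimpleStream input;
    private final int minLen;
    MinLengthFilter(SimpleStream input, int minLen) {
        this.input = input;
        this.minLen = minLen;
    }
    public boolean incrementToken() {
        // Pull upstream tokens until one passes, or the stream ends.
        while (input.incrementToken()) {
            if (input.term().length() >= minLen) return true;
        }
        return false;
    }
    public String term() { return input.term(); }
}
```

The loop never buffers more than the single current token, so by itself it is
constant-memory; it only consumes extra upstream tokens when it discards some.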
Best
Hi,
LimitTokenCountFilter is used to index only the first n tokens. Maybe it can
inspire you.
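The heart of that filter is just a counter guarding a single call to
input.incrementToken(). A dependency-free sketch of the same idea (the names
below are illustrative stand-ins, not the Lucene source):

```java
import java.util.Iterator;
import java.util.List;

// Stand-in for TokenStream: advance and report the current term.
interface TokenSource {
    boolean incrementToken();
    String term();
}

// A trivial producer backed by a fixed list of words.
final class WordSource implements TokenSource {
    private final Iterator<String> it;
    private String current;
    WordSource(List<String> words) { this.it = words.iterator(); }
    public boolean incrementToken() {
        if (!it.hasNext()) return false;
        current = it.next();
        return true;
    }
    public String term() { return current; }
}

// Emits at most maxTokenCount tokens, in the spirit of
// LimitTokenCountFilter: no loop needed, just an early-exit counter.
final class LimitFilter implements TokenSource {
    private final TokenSource input;
    private final int maxTokenCount;
    private int emitted = 0;
    LimitFilter(TokenSource input, int maxTokenCount) {
        this.input = input;
        this.maxTokenCount = maxTokenCount;
    }
    public boolean incrementToken() {
        if (emitted >= maxTokenCount) return false; // limit reached
        if (!input.incrementToken()) return false;  // upstream exhausted
        emitted++;
        return true;
    }
    public String term() { return input.term(); }
}
```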
Ahmet
On Friday, April 21, 2017, 6:20:11 PM GMT+3, Edoardo Causarano
wrote:
Hi all.
I’m relatively new to Lucene, so I have a couple questions about writing custom
filters.
The way I understand it, one would extend
org.apache.lucene.analysis.TokenFilter and override #incrementToken to examine
the current token provided by a stream token producer.
I’d like to write so