What tokenizer are you using? I think, but I'm not entirely sure, that
this would require a bug in a tokenizer.
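For context, the contract Lucene enforces is that every consume cycle on a reused TokenStream must end with close() before setReader() is called again. The snippet below is a minimal self-contained mimic (not Lucene itself; class and method names are illustrative) of the state machine in Tokenizer.setReader that produces this exact message:

```java
import java.io.Reader;
import java.io.StringReader;

// Minimal mimic of the Tokenizer reuse state machine. Lucene's real
// Tokenizer.setReader performs an equivalent check before accepting
// a new reader for the next document.
class MiniTokenizer {
    private Reader input;          // current reader; dropped on close()
    private boolean closed = true; // a freshly constructed tokenizer counts as closed

    void setReader(Reader reader) {
        // Refuse reuse unless the previous cycle ended with close().
        if (!closed) {
            throw new IllegalStateException(
                "TokenStream contract violation: close() call missing");
        }
        this.input = reader;
        this.closed = false;
    }

    void close() {
        this.input = null;
        this.closed = true;
    }
}

public class ContractDemo {
    public static void main(String[] args) {
        MiniTokenizer t = new MiniTokenizer();

        t.setReader(new StringReader("first document"));
        t.close();                                        // correct: close before reuse
        t.setReader(new StringReader("second document")); // fine

        // If close() is skipped -- say an exception during analysis bypassed
        // a finally block -- the next reuse of the cached components fails:
        try {
            t.setReader(new StringReader("third document"));
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This is why the failure can look rare and non-deterministic: the exception surfaces on the *next* use of the cached TokenStreamComponents, not at the point where close() was actually skipped.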


On Tue, Jun 9, 2015 at 10:21 AM, Ryan, Michael F. (LNG-DAY)
<michael.r...@lexisnexis.com> wrote:
> I'm using Solr 4.9.0. I'm trying to figure out what would cause an error like 
> this to occur in a rare, non-deterministic manner:
>
> java.lang.IllegalStateException: TokenStream contract violation: close() call 
> missing
>         at org.apache.lucene.analysis.Tokenizer.setReader(Tokenizer.java:90)
>         at 
> org.apache.lucene.analysis.Analyzer$TokenStreamComponents.setReader(Analyzer.java:307)
>         at org.apache.lucene.analysis.Analyzer.tokenStream(Analyzer.java:183)
>
> Are there any known bugs that would cause this, or unusual conditions? I'm 
> thinking crazy things like a corrupted index, or a hardware issue.
>
> I don't directly use TokenStream, so I'm wondering if there is something that 
> could indirectly cause this (i.e., me doing something wrong that causes 
> Lucene itself to not close the TokenStream).
>
> I can provide more details later. Right now I'm just grasping at straws, 
> hoping someone has encountered this.
>
> -Michael
