RE: Token offset values for custom Tokenizer

2007-07-16 Thread Ard Schrijvers
Hello,

The issue is about Lucene 1.9. Can you test it with Lucene 2.2? Perhaps the issue is already addressed and solved...

Regards,
Ard

> Thank you for the reply Ard,
>
> The tokens exist in the index and are returned accurately, except for
> the offsets. In this case I am not dealing with

Re: Token offset values for custom Tokenizer

2007-07-16 Thread Shahan Khatchadourian
The issue continues to exist with nightly 146 from Jul 10, 2007.
http://lucene.zones.apache.org:8080/hudson/job/Lucene-Nightly/146/

Ard Schrijvers wrote:
> Hello, The issue is about Lucene 1.9. Can you test it with Lucene 2.2?
> Perhaps the issue is already addressed and solved... Regards Ard

Re: Token offset values for custom Tokenizer

2007-07-16 Thread Shahan Khatchadourian
Thank you for the reply Ard,

The tokens exist in the index and are returned accurately, except for the offsets. In this case I am not dealing with the positions, so the term vector is specified as using 'with_offsets'. I have left the term position increment at its default. Looking at the exist
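The distinction being drawn here, token positions versus character offsets, can be sketched in plain Java. This is an illustrative stand-in, not Lucene's actual API: `spans` is a hypothetical helper showing that positions count tokens (the default position increment is 1) while offsets count characters in the source text, which is why a term vector can be stored 'with offsets' without involving positions at all.

```java
public class PositionsVsOffsets {
    // Returns {position, startOffset, endOffset} for each whitespace-separated term.
    // Positions advance by the position increment (default 1) per token;
    // offsets are character indices into the original text, end-exclusive.
    static int[][] spans(String text) {
        String[] terms = text.split("\\s+");
        int[][] out = new int[terms.length][3];
        int cursor = 0;
        for (int pos = 0; pos < terms.length; pos++) {
            int start = text.indexOf(terms[pos], cursor);
            int end = start + terms[pos].length();
            out[pos] = new int[]{pos, start, end};
            cursor = end;
        }
        return out;
    }

    public static void main(String[] args) {
        for (int[] s : spans("red green blue")) {
            System.out.println("pos=" + s[0] + " offsets=[" + s[1] + "," + s[2] + ")");
        }
    }
}
```

For "red green blue" the three tokens get positions 0, 1, 2, while their offsets are the character spans [0,3), [4,9), [10,14).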

RE: Token offset values for custom Tokenizer

2007-07-16 Thread Ard Schrijvers
Hello,

> Hi,
> I am storing custom values in the Tokens provided by a Tokenizer but
> when retrieving them from the index the values don't match.

What do you mean by retrieving? Do you mean retrieving terms, or do you mean doing a search with words you know that should be in, but you do not find

Token offset values for custom Tokenizer

2007-07-13 Thread Shahan Khatchadourian
Hi,

I am storing custom values in the Tokens provided by a Tokenizer, but when retrieving them from the index the values don't match. I've looked in the LIA book, but it's not current since it mentions that term vectors aren't stored. I'm using Lucene Nightly 146, but the same thing has happened with
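For anyone hitting the same mismatch: the bookkeeping a custom Tokenizer has to get right can be sketched without Lucene at all. `SimpleToken` and `tokenize` below are hypothetical stand-ins (in Lucene 2.x-era code the equivalent state lives on `org.apache.lucene.analysis.Token`); the point is that `startOffset` and `endOffset` must be indices into the original character stream, with `endOffset` one past the last character, or the offsets later read back from the index will not line up with the source text.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal illustration of the offset bookkeeping a custom Tokenizer must do.
// SimpleToken and tokenize() are hypothetical stand-ins, not Lucene classes.
public class OffsetSketch {
    static final class SimpleToken {
        final String term;
        final int startOffset; // index of the term's first char in the source text
        final int endOffset;   // index one past the term's last char

        SimpleToken(String term, int startOffset, int endOffset) {
            this.term = term;
            this.startOffset = startOffset;
            this.endOffset = endOffset;
        }
    }

    // Whitespace tokenizer that records where each term came from.
    static List<SimpleToken> tokenize(String text) {
        List<SimpleToken> tokens = new ArrayList<>();
        int i = 0;
        while (i < text.length()) {
            while (i < text.length() && Character.isWhitespace(text.charAt(i))) i++;
            int start = i;
            while (i < text.length() && !Character.isWhitespace(text.charAt(i))) i++;
            if (i > start) {
                tokens.add(new SimpleToken(text.substring(start, i), start, i));
            }
        }
        return tokens;
    }

    public static void main(String[] args) {
        for (SimpleToken t : tokenize("custom token offsets")) {
            System.out.println(t.term + " [" + t.startOffset + "," + t.endOffset + ")");
        }
    }
}
```

If the tokenizer transforms the term text (lowercasing, stemming, substitutions), the offsets should still point at the original characters; a common source of mismatched offsets is computing them against the modified term instead of the input stream.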