Hi,

I just realized that since I upgraded from Lucene 2.x to 3.0.0 (and removed
all the deprecated calls), searches like the following don't work anymore:

test AND blue
test NOT blue
(test AND blue) OR red
etc.
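
I haven't pasted my search code here, but the query side is essentially the
following (just a sketch: the class name is made up, and I'm assuming
QueryParser with the same StandardAnalyzer and "content" field as on the
indexing side):

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.queryParser.ParseException;
import org.apache.lucene.queryParser.QueryParser;
import org.apache.lucene.search.Query;
import org.apache.lucene.util.Version;

public class QuerySyntaxCheck {
    public static void main(String[] args) throws ParseException {
        // Same analyzer and Version as on the indexing side.
        QueryParser parser = new QueryParser(Version.LUCENE_CURRENT, "content",
                new StandardAnalyzer(Version.LUCENE_CURRENT));
        String[] samples = { "test AND blue", "test NOT blue",
                "(test AND blue) OR red" };
        for (String s : samples) {
            Query q = parser.parse(s);
            // Prints how the parser interprets each boolean query.
            System.out.println(s + "  ->  " + q);
        }
    }
}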

Before 3.0.0, I was inserting my fields like this:

doc.add(new Field("content", sValues[j], Field.Store.YES,
        Field.Index.TOKENIZED));

Now I do:

doc.add(new Field("content", sValues[j], Field.Store.YES,
        Field.Index.ANALYZED));

My IndexWriter is opened like this:

writer = new IndexWriter(idx, new StandardAnalyzer(Version.LUCENE_CURRENT),
        true, MaxFieldLength.UNLIMITED);
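
For completeness, the whole indexing path now looks roughly like this
(a self-contained sketch: the class name, the RAMDirectory and the sample
value are placeholders; in my real code idx is an existing Directory and
sValues[j] is the real content):

import java.io.IOException;

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriter.MaxFieldLength;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.RAMDirectory;
import org.apache.lucene.util.Version;

public class IndexSketch {
    public static void main(String[] args) throws IOException {
        Directory idx = new RAMDirectory(); // stand-in for my real Directory
        IndexWriter writer = new IndexWriter(idx,
                new StandardAnalyzer(Version.LUCENE_CURRENT), true,
                MaxFieldLength.UNLIMITED);

        Document doc = new Document();
        // ANALYZED replaces the old TOKENIZED: the value is run through the analyzer.
        doc.add(new Field("content", "test blue", Field.Store.YES,
                Field.Index.ANALYZED));
        writer.addDocument(doc);
        writer.close();
    }
}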

What is the equivalent of TOKENIZED?

Thanks,

- Mike
aka...@gmail.com