You can store them and then run different analyzer chains over them later
(a stored field doesn't need to be indexed).
I'd probably use the collector pattern:
se.search(new MatchAllDocsQuery(), new Collector() {   // se is an IndexSearcher
    private AtomicReader reader;
    private int i = 0;
    @Override public boolean acceptsDocsOutOfOrder() { return true; }
    @Override public void setScorer(Scorer scorer) {}
    @Override public void setNextReader(AtomicReaderContext context) { reader = context.reader(); }
    @Override public void collect(int doc) throws IOException {
        i++; // e.g. reader.document(doc) to fetch the stored field for this doc and re-analyze it
    }
});
: I want to use Solr for academic research. One step of my project is to
: store the tokens in a file (I will store them in a database later) and I
you could absolutely write a Java program which accesses the analyzers
directly and does whatever you want with the results of analysing a piece
of text.
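Something like this rough sketch against the Lucene 4.x analysis API (the
field name, the sample text and StandardAnalyzer are just placeholders,
swap in whatever analyzer chain you actually want to run):

import java.io.IOException;
import java.io.StringReader;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.util.Version;

public class TokenDump {
    public static void main(String[] args) throws IOException {
        Analyzer analyzer = new StandardAnalyzer(Version.LUCENE_46);
        TokenStream ts = analyzer.tokenStream("content", new StringReader("text to analyse"));
        CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
        ts.reset();
        while (ts.incrementToken()) {
            // write term.toString() to your file or database instead of printing it
            System.out.println(term.toString());
        }
        ts.end();
        ts.close();
        analyzer.close();
    }
}

No index and no Solr are needed for that step; the lucene-core and
lucene-analyzers-common jars are enough.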
Hello!
Take a look at custom posting formats. For example,
here is a nice post showing what you can do with the Lucene SimpleText
codec:
http://blog.mikemccandless.com/2010/10/lucenes-simpletext-codec.html
However, please remember that it is not advised to use that codec in a
production environment.
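For experimenting, though, it is easy to wire in. Roughly like this (a
sketch for Lucene 4.x; SimpleTextCodec lives in the lucene-codecs module
and the index path is just a placeholder):

import java.io.File;
import java.io.IOException;
import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.codecs.simpletext.SimpleTextCodec;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;
import org.apache.lucene.util.Version;

public class SimpleTextDemo {
    public static void main(String[] args) throws IOException {
        // placeholder path; the index files written here are plain text you can open in an editor
        Directory dir = FSDirectory.open(new File("simpletext-index"));
        Analyzer analyzer = new StandardAnalyzer(Version.LUCENE_46);
        IndexWriterConfig iwc = new IndexWriterConfig(Version.LUCENE_46, analyzer);
        iwc.setCodec(new SimpleTextCodec()); // write the index with the SimpleText codec
        IndexWriter writer = new IndexWriter(dir, iwc);
        // add documents here, then look at the index directory afterwards
        writer.close();
    }
}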
Hi;
I want to use Solr for academic research. One step of my project is to
store the tokens in a file (I will store them in a database later) and I
don't want to index them. For this kind of purpose, should I use core
Lucene or Solr? Is there an example of writing a custom analyzer and just
getting the tokens out without indexing them?