Luceners,
            When a field is not tokenized, should I replace every space
with a "?"?
 
I'm searching for: my dear
 
If I test with Luke, it splits the query into 'my' and 'dear', so it
finds nothing in my untokenized field.
 
The same happens with

"my dear"

In this case I don't understand why it still splits into two words.
 
The only solution I found was to replace " " with "?". It doesn't seem
to be the best choice.
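To be concrete, the replacement I mean is just this (plain Java, no
Lucene needed for the string step; the class and method names are only
for illustration). It seems to work because the query parser keeps a
wildcard term as a single token, and '?' matches any one character, so
"my?dear" happens to match the literal "my dear" stored in the
untokenized field:

```java
public class SpaceToWildcard {

    // Hypothetical helper: turn a multi-word phrase into a single
    // wildcard term so the query parser does not split it on spaces.
    static String toWildcard(String phrase) {
        return phrase.replace(' ', '?');
    }

    public static void main(String[] args) {
        // '?' stands in for the space in the stored term "my dear".
        System.out.println(toWildcard("my dear")); // prints "my?dear"
    }
}
```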
