Yonik Seeley wrote:
The high-level description of the new API looks good (being able to
add arbitrary properties to tokens); unfortunately, I've never had the
time to try it out and give any constructive feedback.
As far as difficulty of use goes, I assume this only applies to
implementing your own TokenFilter? It seems like most standard users
would just be stringing together existing TokenFilters to create
custom Analyzers?
-Yonik
http://www.lucidimagination.com
True - it's the implementation, and just trying to understand what's going
on the first time you see it.
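
For the record, that part looks much the same as it always did - building a
custom Analyzer is still just wiring filters together. A rough sketch only
(the class name and filter choices are illustrative, and it assumes the
pre-Version constructors):

import java.io.Reader;

import org.apache.lucene.analysis.Analyzer;
import org.apache.lucene.analysis.LowerCaseFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.WhitespaceTokenizer;

// A custom Analyzer built entirely from existing pieces; nothing here
// touches the attribute API directly.
public class MyChainedAnalyzer extends Analyzer {
  @Override
  public TokenStream tokenStream(String fieldName, Reader reader) {
    TokenStream stream = new WhitespaceTokenizer(reader); // split on whitespace
    stream = new LowerCaseFilter(stream);                 // then lowercase each token
    return stream;
  }
}

For that kind of user the change is mostly invisible; it's writing the
filters themselves where the new API shows up.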
It's not particularly difficult, but it's also not as obvious as the
previous API was. As a user, I would ask why that is, and frankly the
answer wouldn't do much for me (as a user).
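
To make that concrete, here is roughly what a trivial filter looks like
under the new attribute-based API (a sketch only, assuming the 2.9-style
TermAttribute; the uppercasing filter is just an example transformation):

import java.io.IOException;

import org.apache.lucene.analysis.TokenFilter;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.tokenattributes.TermAttribute;

// Under the new API you don't receive a Token to modify; you register the
// attributes you care about in the constructor and mutate them in
// incrementToken().
public final class UpperCaseFilter extends TokenFilter {
  private final TermAttribute termAtt;

  public UpperCaseFilter(TokenStream input) {
    super(input);
    termAtt = addAttribute(TermAttribute.class); // declare interest in the term text
  }

  @Override
  public boolean incrementToken() throws IOException {
    if (!input.incrementToken()) {
      return false; // no more tokens upstream
    }
    // Change the term in place through the attribute rather than a Token.
    termAtt.setTermBuffer(termAtt.term().toUpperCase());
    return true;
  }
}

None of that is hard, but addAttribute() and incrementToken() don't explain
themselves the way "here's a Token, change it and return it" did.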
I don't know whether most 'standard' users implement their own or not. I will
say (and perhaps I was in a special situation) that I was writing and
modifying them almost as soon as I started playing with Lucene. And even when
I wasn't, I needed to understand the code to follow some of the complexities
that could come up, and thankfully, that was a breeze to do.
Right now, if you told me to go convert all of Solr to the new API, you
would hear a mighty groan.
As Lucene's contrib hasn't been fully converted either (and it's been
quite some time now), someone has probably heard that groan before.
--
- Mark
http://www.lucidimagination.com