Actually... I think there may be something wrong here: BaseTokenizerFactory does not make a Tokenizer, it creates a TokenStream, so it should never be cast to Tokenizer.
My custom TokenizerFactory now looks the same as:
o.a.s.analysis.PatternTokenizerFactory

Not sure what to look at next... ideas?

thanks
ryan

On Fri, Aug 21, 2009 at 10:13 AM, Ryan McKinley <ryan...@gmail.com> wrote:
> Just updated to /trunk and am now seeing this exception:
>
> Caused by: org.apache.solr.client.solrj.SolrServerException:
> java.lang.ClassCastException:
> xxx.solr.analysis.JSONKeyValueTokenizerFactory$1 cannot be cast to
> org.apache.lucene.analysis.Tokenizer
>         at org.apache.solr.client.solrj.embedded.EmbeddedSolrServer.request(EmbeddedSolrServer.java:141)
>         ... 15 more
> Caused by: java.lang.ClassCastException:
> xxx.solr.analysis.JSONKeyValueTokenizerFactory$1 cannot be cast to
> org.apache.lucene.analysis.Tokenizer
>         at org.apache.solr.analysis.TokenizerChain.getStream(TokenizerChain.java:69)
>         at org.apache.solr.analysis.SolrAnalyzer.reusableTokenStream(SolrAnalyzer.java:74)
>         at org.apache.solr.schema.IndexSchema$SolrIndexAnalyzer.reusableTokenStream(IndexSchema.java:364)
>         at org.apache.lucene.index.DocInverterPerField.processFields(DocInverterPerField.java:124)
>         at org.apache.lucene.index.DocFieldProcessorPerThread.processDocument(DocFieldProcessorPerThread.java:244)
>         at org.apache.lucene.index.DocumentsWriter.updateDocument(DocumentsWriter.java:772)
>
> Looks like SolrIndexAnalyzer now assumes everything uses the new
> TokenStream API...
>
> I'm fine upgrading, but it seems we should make the 'back compatibility'
> notice more explicit.
>
> FYI, this is what the TokenizerFactory looks like:
>
> public class JSONKeyValueTokenizerFactory extends BaseTokenizerFactory
> {
>   ...
>
>   public TokenStream create(Reader input) {
>     final JSONParser js = new JSONParser( input );
>     final Stack<String> keystack = new Stack<String>();
>
>     return new TokenStream()
>     {
>       ...
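To make the failure mode concrete: TokenizerChain casts the factory's return value to Tokenizer, but the factory above returns an anonymous subclass of TokenStream (the `JSONKeyValueTokenizerFactory$1` in the stack trace), which is not a Tokenizer. Here is a minimal, self-contained sketch of that type relationship; the two classes below are hypothetical stand-ins for the Lucene ones, not the real API:

```java
// Simplified stand-ins for org.apache.lucene.analysis.TokenStream/Tokenizer
// (hypothetical, dependency-free versions just to show the class hierarchy).
class TokenStream { }
class Tokenizer extends TokenStream { }

public class CastDemo {
    // Mirrors the factory above: an anonymous TokenStream subclass,
    // analogous to JSONKeyValueTokenizerFactory$1 in the stack trace.
    static TokenStream createAnonymousStream() {
        return new TokenStream() { }; // NOT a Tokenizer
    }

    // One possible fix: extend Tokenizer instead, so the downstream
    // cast in TokenizerChain.getStream would succeed.
    static TokenStream createFromTokenizer() {
        return new Tokenizer() { };
    }

    public static void main(String[] args) {
        // TokenizerChain.getStream effectively does "(Tokenizer) stream";
        // instanceof shows which return value survives that cast.
        System.out.println(createAnonymousStream() instanceof Tokenizer); // false
        System.out.println(createFromTokenizer() instanceof Tokenizer);   // true
    }
}
```

So if the reuse path in SolrAnalyzer really requires a Tokenizer, the anonymous class in `create(Reader)` would need to extend Tokenizer rather than TokenStream, which is exactly the tension with BaseTokenizerFactory's `TokenStream create(Reader)` signature noted above.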