Hey guys, great job with Solr. I've been using a similar server-wrapped-around-Lucene setup for a few years now and am migrating over to Solr. My big issue is that I need to maintain 4 different search indices with different schemas. I'm only moving one over for now, but I like to think ahead, so here are my options:

1) Run 4 different instances of an app server, each with a different -Dsolr.solr.home - not really optimal

2) Run 4 copies of the webapp inside a single app server instance and configure each home separately with JNDI - haven't tried this yet, does that work? (a rough sketch of the lookup I'm imagining is after this list)

3) Extend Solr to handle multiple indices
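
For option 2, my assumption is that each webapp copy would resolve its own home from a per-copy JNDI binding, falling back to the system property. Something like the following is what I have in mind - the java:comp/env/solr/home name and the fallback order are guesses on my part, not something I've verified against the Solr source:

    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;

    // Sketch of per-webapp home resolution for option 2. The JNDI name
    // ("java:comp/env/solr/home") and the fallback order are my assumptions.
    public class SolrHomeLocator {

        public static String locateSolrHome() {
            try {
                Context env = (Context) new InitialContext().lookup("java:comp/env");
                String jndiHome = (String) env.lookup("solr/home");
                if (jndiHome != null) {
                    return jndiHome; // each webapp copy binds its own value
                }
            } catch (NamingException e) {
                // no JNDI entry bound for this copy -- fall through
            }
            String sysHome = System.getProperty("solr.solr.home");
            return sysHome != null ? sysHome : "solr/"; // last-resort default
        }
    }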

I looked at the code, and it looks like I could extend SolrCore to maintain a hashmap of Solr cores and schemas under the following setup:

-define multiple schemas in schema.xml with <schema name="indexName">
-pass information about which index you want as an additional CGI parameter on searches, and as an XML attribute on the <add>, <delete>, <optimize>, and <commit> tags, e.g. <add indexName="desiredIndex"><doc...><doc...></add> or <optimize indexName="desiredIndex" /> -- if not specified, operations take place on the default index
-change or subclass the handlers to call SolrMultiCore.getCore(String indexName) and SolrMultiCore.getSchema(String indexName)
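
To make that concrete, the registry piece I have in mind would look roughly like this -- just a sketch, the class and method names are mine, and I'm assuming SolrCore.getSchema() is available to delegate to:

    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    import org.apache.solr.core.SolrCore;
    import org.apache.solr.schema.IndexSchema;

    // Sketch of the proposed registry: a map of named cores in place of
    // SolrCore's singleton. Names here are made up, not existing Solr API.
    public class SolrMultiCore {

        public static final String DEFAULT_NAME = "default";

        private static final Map<String, SolrCore> cores =
                new ConcurrentHashMap<String, SolrCore>();

        // called once per <schema name="..."> at startup
        public static void register(String indexName, SolrCore core) {
            cores.put(indexName, core);
        }

        // a missing/empty index name falls back to the default index
        public static SolrCore getCore(String indexName) {
            String key = (indexName == null || indexName.length() == 0)
                    ? DEFAULT_NAME : indexName;
            SolrCore core = cores.get(key);
            if (core == null) {
                throw new IllegalArgumentException("unknown index: " + indexName);
            }
            return core;
        }

        public static IndexSchema getSchema(String indexName) {
            return getCore(indexName).getSchema(); // assumes this accessor stays put
        }
    }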


Is this something the community would be interested in? I left off replication because I'm not using it and haven't dug deep enough to understand it; assuming I disable replication, is there anything I'm missing? So far I'm planning on writing SolrMultiCore (basically a carbon copy of SolrCore, except that it maintains a hashmap of instances instead of a singleton) and making a solr.handler.multi package with subclassed versions of all the handlers that use SolrMultiCore, then configuring those handlers into Solr in place of the existing ones.
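
The subclassed handlers would mostly boil down to pulling the index name out of the request and looking up the core before doing the usual work. Here's the shape of it, sketched at the servlet level rather than against the real handler API (which I haven't pinned down yet); doSearch() is just a placeholder for the existing search logic:

    import java.io.IOException;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.apache.solr.core.SolrCore;

    // Sketch of the dispatch step for searches: read the indexName CGI
    // parameter, look up the matching core, then run the usual logic.
    // doSearch() is a placeholder, not a real Solr method.
    public class MultiIndexSearchServlet extends HttpServlet {

        protected void doGet(HttpServletRequest req, HttpServletResponse rsp)
                throws ServletException, IOException {
            String indexName = req.getParameter("indexName"); // null means default index
            SolrCore core = SolrMultiCore.getCore(indexName);
            doSearch(core, req, rsp);
        }

        private void doSearch(SolrCore core, HttpServletRequest req, HttpServletResponse rsp)
                throws IOException {
            // ...existing SolrServlet/handler logic, now parameterized by core...
        }
    }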

Any pitfalls I'm missing? Suggestions? Would you guys like the code after I'm done?
