I have a newbie question on the best way to batch add/commit a large
collection of document data via solrj.  My first attempt was to write a
multi-threaded application that did the following.

Collection<SolrInputDocument> docs = new ArrayList<SolrInputDocument>();
for (Widget w : widgets) {
    SolrInputDocument doc = new SolrInputDocument();
    doc.addField("id", w.getId());
    doc.addField("name", w.getName());
    doc.addField("price", w.getPrice());
    doc.addField("category", w.getCat());
    doc.addField("srcType", w.getSrcType());
    docs.add(doc);

    // add and commit to the Solr server on every iteration
    server.add(docs);
    server.commit();
}

And I got this exception.

org.apache.solr.common.SolrException:
Error_opening_new_searcher_exceeded_limit_of_maxWarmingSearchers2_try_again_later

        at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:424)
        at org.apache.solr.client.solrj.impl.CommonsHttpSolrServer.request(CommonsHttpSolrServer.java:243)
        at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:105)
        at org.apache.solr.client.solrj.SolrServer.commit(SolrServer.java:86)

The solrj wiki/documentation seemed to indicate that the error occurred
because multiple threads were calling SolrServer.commit(), which in turn
called CommonsHttpSolrServer.request(), resulting in too many warming
searchers being opened at once.  My first thought was to change the
autowarming configuration, but after looking at the autowarm params below,
I am not sure what should be changed, or whether a different approach
(sketched after the cache settings) is recommended.

    <filterCache
      class="solr.FastLRUCache"
      size="512"
      initialSize="512"
      autowarmCount="0"/>

    <queryResultCache
      class="solr.LRUCache"
      size="512"
      initialSize="512"
      autowarmCount="0"/>

    <documentCache
      class="solr.LRUCache"
      size="512"
      initialSize="512"
      autowarmCount="0"/>
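
The different approach I had in mind (not sure if it is the right one) would
be to send the documents in batches from a single thread and commit only once
after everything has been added.  Here is a rough sketch; the batch size of
1000 is arbitrary and "server" is assumed to be the same CommonsHttpSolrServer
instance as above:

    Collection<SolrInputDocument> docs = new ArrayList<SolrInputDocument>();
    for (Widget w : widgets) {
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", w.getId());
        doc.addField("name", w.getName());
        doc.addField("price", w.getPrice());
        doc.addField("category", w.getCat());
        doc.addField("srcType", w.getSrcType());
        docs.add(doc);

        // send a batch of documents, but do not commit yet
        if (docs.size() >= 1000) {
            server.add(docs);
            docs.clear();
        }
    }
    // send whatever is left over, then commit once at the end
    if (!docs.isEmpty()) {
        server.add(docs);
    }
    server.commit();

Would this avoid the maxWarmingSearchers problem, or is there a better way?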

Your help is much appreciated.
