Qun,

Are you using blockUntilFinished() and/or shutdown()?

One thing to note is that a commit is just another "document," so writing a
commit into the queue of the ConcurrentUpdateSolrServer isn't enough to get
the buffered documents flushed out ahead of it.
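To illustrate, here's a minimal sketch of the pattern I mean: drain the
client's internal queue with blockUntilFinished() before issuing the commit.
The URL, core name, and queue/thread sizes below are placeholders, not values
from your setup:

```java
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrServer;
import org.apache.solr.common.SolrInputDocument;

public class FlushBeforeCommit {
    public static void main(String[] args) throws Exception {
        // Placeholder URL; queue size 100, 4 threads, as in your description.
        ConcurrentUpdateSolrServer server = new ConcurrentUpdateSolrServer(
                "http://localhost:8983/solr/shard1", 100, 4);

        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "example-1");
        server.add(doc);

        // Wait until every queued update has actually been sent to the
        // shard, so the commit that follows isn't sitting behind them.
        server.blockUntilFinished();
        server.commit();

        // shutdown() also waits for the queue to drain before releasing
        // the client's threads.
        server.shutdown();
    }
}
```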


Michael Della Bitta

Applications Developer

o: +1 646 532 3062  | c: +1 917 477 7906

appinions inc.

“The Science of Influence Marketing”

18 East 41st Street

New York, NY 10017

t: @appinions <https://twitter.com/Appinions> | g+:
plus.google.com/appinions
w: appinions.com <http://www.appinions.com/>


On Thu, Jun 27, 2013 at 10:21 AM, qungg <qzheng1...@gmail.com> wrote:

> Hi,
>
> I'm using ConcurrentUpdateSolrServer to do my incremental indexing nightly.
> I have 50 shards to index into, about 10,000 documents each night. I start
> one ConcurrentUpdateSolrServer for each shard and begin sending documents.
> The queue size for each ConcurrentUpdateSolrServer is 100, with 4 threads.
> At the end of the import, I send a commit using the same
> ConcurrentUpdateSolrServer. The problem is that some of the
> ConcurrentUpdateSolrServer instances are not sending the commit to the
> shards, and the import task hangs for a couple of hours.
>
> So I looked at the log and found that the shards received about 1,000
> documents a couple of hours later, followed by a commit. Are there any
> methods I can call to flush out documents before I send the commit? Or is
> there any existing issue with ConcurrentUpdateSolrServer related to
> this?
>
> Thanks,
> Qun
>
>
>
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/ConcurrentUpdateSolrServer-hanging-tp4073620.html
> Sent from the Solr - User mailing list archive at Nabble.com.
>
