Hi,

I have around 12 million objects in my PostgreSQL database that need to be indexed.
I'm running a thread to fetch the rows from the database; that thread also
creates the documents and puts them into an indexing queue. While this is
happening, my main process retrieves the documents from the queue and
indexes them in batches of 1000. For some time the process runs as
expected, but after a while I get an exception.

[corePostProcess] org.apache.solr.client.solrj.SolrServerException:
IOException occured when talking to server at:
http://localhost:8983/solr/mine-search
[corePostProcess] Caused by: java.net.SocketException: Broken pipe (Write failed)
[corePostProcess]     at java.net.SocketOutputStream.socketWrite0(Native Method)


I tried increasing the batch size up to 30,000, and then got a different
exception.

[corePostProcess] org.apache.solr.client.solrj.SolrServerException:
IOException occured when talking to server at:
http://localhost:8983/solr/mine-search
[corePostProcess] Caused by: org.apache.http.NoHttpResponseException: localhost:8983 failed
to respond


I would like to know whether there are any good practices for handling this
situation, such as a maximum number of documents to index in one request.

My environment:

Version: Solr 7.2, SolrJ 7.2
Ubuntu 16.04
RAM: 20 GB
Solr started in standalone mode
Number of replicas and shards: 1

The method I used:
                UpdateResponse response = solrClient.add(solrDocumentList);
                solrClient.commit();
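For context, here is a minimal sketch of how the batching could be kept at a
fixed size regardless of how many documents accumulate in the queue. The
partition helper below is hypothetical (not from SolrJ); the comments name the
SolrJ calls (`solrClient.add`, `solrClient.commit`) that would be made per
batch, but the Solr calls themselves are left as comments so the sketch stays
self-contained.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchIndexer {

    // Split a large document list into fixed-size batches so that each
    // solrClient.add(...) request stays small enough to avoid request
    // timeouts or broken-pipe errors on very large payloads.
    public static <T> List<List<T>> partition(List<T> items, int batchSize) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        // Stand-in for the documents drained from the indexing queue.
        List<Integer> docs = new ArrayList<>();
        for (int i = 0; i < 2500; i++) docs.add(i);

        for (List<Integer> batch : partition(docs, 1000)) {
            // In the real indexer each batch would be sent with:
            //   solrClient.add(batchOfSolrInputDocuments);
            // and committed periodically (or via autoCommit in
            // solrconfig.xml) rather than after every single batch.
            System.out.println("would index batch of " + batch.size());
        }
    }
}
```

Committing after every batch is expensive; many setups instead rely on
`autoCommit`/`autoSoftCommit` on the server side and call `commit()` once at
the end of the run.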


Thanks in advance.

Arunan
