Hi,

Lots of little things to look at here.
You should run lsof as root; it doesn't look like you're doing that.
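For what it's worth, here is one way to compare the descriptor count against the limit (a sketch, Linux only; the PID here is a stand-in, so substitute Tomcat's actual PID):

```shell
# lsof run as a regular user only sees that user's processes, so the count
# can look much lower than reality; run it as root (or via sudo).
# Counting through /proc gives the same number without parsing lsof output.
# $$ (the current shell) is just a placeholder PID for illustration.
pid=$$
count=$(ls /proc/"$pid"/fd | wc -l)
echo "open descriptors for PID $pid: $count"
echo "per-process limit (ulimit -n): $(ulimit -n)"
```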
You should double-check Tomcat's maxThreads parameter in server.xml.
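For reference, maxThreads sits on the Connector element in conf/server.xml; the values below are illustrative defaults, not recommendations:

```xml
<!-- conf/server.xml: the HTTP connector. maxThreads caps concurrent
     request-processing threads (Tomcat's default is 200). If the indexer
     saturates these, browser/admin requests can hang or come back blank. -->
<Connector port="8080" protocol="HTTP/1.1"
           maxThreads="200"
           acceptCount="100"
           connectionTimeout="20000" />
```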
You should give Jetty a try.
I don't think you mentioned checking the container's or Solr's logs
for errors.
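For example (the log locations are assumptions, so adjust them to your layout):

```shell
# Scan the container and Solr logs for recent errors; these paths are
# typical Tomcat defaults and may differ on your install.
grep -iE 'severe|exception|too many open files' \
    /usr/local/tomcat/logs/catalina.out \
    /usr/local/tomcat/logs/solr.log 2>/dev/null | tail -n 20
```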


Otis 
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



----- Original Message ----
> From: vivek sar <vivex...@gmail.com>
> To: solr-user@lucene.apache.org
> Sent: Wednesday, April 15, 2009 7:28:57 PM
> Subject: Re: Question on StreamingUpdateSolrServer
> 
> Thanks Otis.
> 
> I did increase the number of file descriptors to 22K, but I still get
> this problem. I've noticed the following so far,
> 
> 1) As soon as I get to around 1140 index segments (this is total over
> multiple cores) I start seeing this problem.
> 2) When the problem starts, the index request (solrserver.commit)
> also occasionally fails with the following error,
>       java.net.SocketException: Connection reset
> 3) Whenever the commit fails, I'm able to access Solr from the browser
> (http://ets11.co.com/solr). If the commit is successful and in progress,
> I get a blank page in Firefox. Even telnet to 8080 fails with
> "Connection closed by foreign host."
> 
> It does seem like there is some resource issue, as it happens only once
> we reach a breaking point (too many index segment files) - lsof at
> this point usually shows around 1400, but my ulimit is much higher than
> that.
> 
> I already use the compound format for index files. I can also run optimize
> occasionally (though that's not preferred, as it blocks the whole index
> cycle for a long time). I do want to find out what resource limitation is
> causing this; it seems to have something to do with the Indexer
> committing records when there is a large number of segment files.
> 
> Any other ideas?
> 
> Thanks,
> -vivek
> 
> On Wed, Apr 15, 2009 at 3:10 PM, Otis Gospodnetic
> wrote:
> >
> > One more thing.  I don't think this was mentioned, but you can:
> > - optimize your indices
> > - use compound index format
> >
> > That will lower the number of open file handles.
> >
> >  Otis
> > --
> > Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
> >
> >
> >
> > ----- Original Message ----
> >> From: vivek sar 
> >> To: solr-user@lucene.apache.org
> >> Sent: Friday, April 10, 2009 5:59:37 PM
> >> Subject: Re: Question on StreamingUpdateSolrServer
> >>
> >> I also noticed that the Solr app has over 6000 file handles open -
> >>
> >>     "lsof | grep solr | wc -l"   - shows 6455
> >>
> >> I have 10 cores (using multi-core) managed by the same Solr instance. As
> >> soon as I start up Tomcat, the open file count goes up to 6400. A few
> >> questions,
> >>
> >> 1) Why is Solr holding on to all the segments from all the cores - is
> >> it because of auto-warmer?
> >> 2) How can I reduce the open file count?
> >> 3) Is there a way to stop the auto-warmer?
> >> 4) Could this be related to "Tomcat returning blank page for every 
> >> request"?
> >>
> >> Any ideas?
> >>
> >> Thanks,
> >> -vivek
> >>
> >> On Fri, Apr 10, 2009 at 1:48 PM, vivek sar wrote:
> >> > Hi,
> >> >
> >> >  I was using CommonsHttpSolrServer for indexing, but having two
> >> > threads writing (10K batches) at the same time was throwing,
> >> >
> >> >  "ProtocolException: Unbuffered entity enclosing request can not be repeated."
> >> >
> >> > I switched to StreamingUpdateSolrServer (using addBeans) and I don't
> >> > see the problem anymore. The speed is very fast - I'm getting around
> >> > 25k/sec (single thread) - but I'm facing another problem. When the
> >> > indexer using StreamingUpdateSolrServer is running, I'm not able to
> >> > send any URL request from the browser to the Solr web app. I just get
> >> > a blank page. I can't even get to the admin interface. I'm also not
> >> > able to shut down the Tomcat running the Solr webapp while the Indexer
> >> > is running. I have to first stop the Indexer app and then stop Tomcat.
> >> > I don't have this problem when using CommonsHttpSolrServer.
> >> >
> >> > Here is how I'm creating it,
> >> >
> >> > server = new StreamingUpdateSolrServer(url, 1000, 3);
> >> >
> >> > I simply call server.addBeans(...) on it. Is there anything else I
> >> > need to do to make use of StreamingUpdateSolrServer? Why does Tomcat
> >> > become unresponsive when the Indexer using StreamingUpdateSolrServer
> >> > is running (though indexing happens fine)?
> >> >
> >> > Thanks,
> >> > -vivek
> >> >
> >
> >
