RE: Lock obtain timeout

2007-01-15 Thread Stephanie Belton
Thanks for that - I have made the following changes:

- optimize more often
- omitNorms on all non-fulltext fields
- useCompoundFile=true (will keep an eye on performance)

And that seems to have solved the problem.
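For anyone following along, here is a rough sketch of where those settings live in a Solr 1.x setup. The field names below are placeholders, not from this thread, and exact element placement can vary by version:

```xml
<!-- solrconfig.xml: inside the <indexDefaults> / <mainIndex> sections -->
<useCompoundFile>true</useCompoundFile>

<!-- schema.xml: omitNorms="true" on fields that don't need length
     normalization (i.e. non-fulltext fields). "id" and "price" are
     hypothetical example field names. -->
<field name="id"    type="string" indexed="true" stored="true" omitNorms="true"/>
<field name="price" type="sfloat" indexed="true" stored="true" omitNorms="true"/>
```

With the compound file format, Lucene packs the many per-segment files into one .cfs file per segment, which is what cuts the open-file count.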

-Original Message-
From: Chris Hostetter [mailto:[EMAIL PROTECTED] 
Sent: 13 January 2007 01:37
To: solr-user@lucene.apache.org
Subject: Re: Lock obtain timeout



: Are the two problems related? Looking through the mailing list it seems
: that changing the settings for useCompoundFile from false to true could
: help but before I do that I would like to understand if there are
: undesirable side effects, why isn't this param set to true by
: default?

Too Many Open Files can result from several different causes:
one is that you have so many indexed fields with norms that the number of
files in your index is too big -- that's the use case where
useCompoundFile=true can help you -- but it's not set that way by default
because it can make searching slower.

the other reason why you can have too many open files is if you are
getting more concurrent requests than you can handle -- or if the clients
initiating those requests aren't closing them properly (sockets count as
files too).

understanding why you are getting these errors requires that you look at
what your hard and soft file limits are (ulimit -aH and ulimit -aS on my
system) and what files are in use by Solr when these errors occur (lsof -p
_solrpid_).
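To make those diagnostic steps concrete, a sketch of the commands. The "start.jar" pattern for locating the Solr JVM is an assumption (it matches the Jetty launcher Solr ships with); substitute the real pid from your servlet container if it differs:

```shell
# Show the soft and hard open-file limits for this shell/user.
echo "soft limit: $(ulimit -Sn)"
echo "hard limit: $(ulimit -Hn)"

# Count descriptors (regular files AND sockets) held open by the Solr JVM.
# "start.jar" is a guess at the process name -- adjust for your container,
# or plug the pid in directly: lsof -p <pid>
SOLR_PID=$(pgrep -f start.jar 2>/dev/null | head -n 1)
if [ -n "$SOLR_PID" ]; then
    lsof -p "$SOLR_PID" | wc -l
else
    echo "no Solr process found -- run lsof -p <pid> against the real pid"
fi
```

If the lsof count is creeping up toward the soft limit during normal operation, that points at unclosed client connections rather than index file count.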

to answer your earlier question, i *think* you may be getting the lock
timeout errors because Solr can't access the lock file once it can't
open any more files ... i'm not 100% sure.




-Hoss




