CompoundFileReader question/'leaking' file descriptors?

2006-02-12 Thread Paul Smith
I've been hunting an insidious problem whereby, during heavy incremental indexing operations in production on a Red Hat EL3 machine, I notice that the Java process has a lot of open files which appear to have been deleted. Now, before anyone jumps in, yes, I know the open file limit needs to be in...
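
For illustration only: the symptom described above (many open-but-deleted files held by the process) is the kind of thing produced when an IndexReader is re-opened after index updates without the previous one ever being closed. The following Java sketch shows that leaky pattern against the Lucene 1.4-era API; the class name and index path are hypothetical.

```java
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.IndexSearcher;

public class LeakyReopen {
    public static void main(String[] args) throws Exception {
        // Hypothetical index location, for illustration only.
        String indexPath = "/data/search/index";

        IndexReader reader = IndexReader.open(indexPath);
        IndexSearcher searcher = new IndexSearcher(reader);

        // ... the index is updated by an IndexWriter elsewhere ...

        // Re-opening without closing the old reader: the stale reader's
        // CompoundFileReader still holds its underlying FSInputStream, so
        // the old (now deleted) segment files stay open and the process's
        // file-descriptor count keeps climbing.
        reader = IndexReader.open(indexPath);
        searcher = new IndexSearcher(reader);
    }
}
```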

Re: CompoundFileReader question/'leaking' file descriptors?

2006-02-13 Thread Doug Cutting
Paul Smith wrote: "We're using Lucene 1.4.3, and after hunting around in the source code just to see what I might be missing, I came across this, and I'd just like some comments." Please try using a 1.9 build to see if this is something that's perhaps already been fixed. CompoundFileReader...

Re: CompoundFileReader question/'leaking' file descriptors?

2006-02-13 Thread Paul Smith
On 14/02/2006, at 7:44 AM, Doug Cutting wrote: Paul Smith wrote: "We're using Lucene 1.4.3, and after hunting around in the source code just to see what I might be missing, I came across this, and I'd just like some comments." Please try using a 1.9 build to see if this is something that...

Re: CompoundFileReader question/'leaking' file descriptors?

2006-02-13 Thread Paul Smith
No, all CSInputStreams share a single FSInputStream, so the FSInputStream shouldn't be closed until all of the CSInputStreams have been closed. This is done by CompoundFileReader.close(). It sounds like that's what's not getting called. As you update indexes, how do you close stale...
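
A minimal sketch of the fix Doug is pointing at, assuming a single shared reader that is swapped whenever the index changes: close the stale IndexReader once the new one is open, so CompoundFileReader.close() can release the shared FSInputStream. The class and method names here are hypothetical.

```java
import java.io.IOException;
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.search.IndexSearcher;

public class SearcherHolder {
    private final String indexPath;
    private IndexReader reader;
    private IndexSearcher searcher;

    public SearcherHolder(String indexPath) throws IOException {
        this.indexPath = indexPath;
        this.reader = IndexReader.open(indexPath);
        this.searcher = new IndexSearcher(reader);
    }

    /** Call after the index has been updated. */
    public synchronized void reopen() throws IOException {
        IndexReader stale = reader;
        reader = IndexReader.open(indexPath);
        searcher = new IndexSearcher(reader);
        if (stale != null) {
            // Closing the stale reader closes its CompoundFileReader,
            // which in turn closes the shared FSInputStream and frees
            // the descriptors for the old (possibly deleted) segment files.
            stale.close();
        }
    }

    public synchronized IndexSearcher getSearcher() {
        return searcher;
    }
}
```

In practice, searches may still be in flight on the stale reader when reopen() runs, so production code usually reference-counts readers or delays the close; the sketch only shows where the missing close() call belongs.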

Re: CompoundFileReader question/'leaking' file descriptors?

2006-02-13 Thread Doug Cutting
Paul Smith wrote: "Is 1.9 binary backward compatible (both source code and index format)?" That is the intent. Try a nightly build: http://cvs.apache.org/dist/lucene/java/nightly/ Doug