Hello,

> I just committed one!  This was really already there, in
> SegmentsReader,
> but it was not public and needed a few minor changes.  Enjoy.
> 
> Doug

Great -- thanks! Do you feel that managing an index made up of numerous smaller 
indices is an effective use of MultiReader and MultiSearcher? Ignoring for a 
moment the potential for a "Too many open files" error, it seems like a reasonable 
way to ensure overall index integrity by managing smaller parts.
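For what it's worth, here is roughly how I picture wiring it up -- just a sketch 
against the current API, with made-up index paths:

```java
import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.MultiReader;
import org.apache.lucene.search.IndexSearcher;

public class MultiIndexExample {
    public static void main(String[] args) throws Exception {
        // Open each of the smaller indices separately (paths are hypothetical).
        IndexReader[] readers = new IndexReader[] {
            IndexReader.open("/indexes/part1"),
            IndexReader.open("/indexes/part2")
        };

        // MultiReader presents the sub-indices as one logical index;
        // closing it closes the underlying sub-readers as well.
        IndexReader merged = new MultiReader(readers);
        IndexSearcher searcher = new IndexSearcher(merged);

        // ... run queries against 'searcher' exactly as with a single index ...

        searcher.close();
        merged.close();
    }
}
```

Each sub-index could then be rebuilt or verified independently without touching 
the others.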

As a side note, regarding the "Too many open files" issue, has anyone noticed that 
this could be related to the JVM? For instance, a coworker of mine tried to run a 
number of "optimized" indexes in a single JVM instance and received the "Too many 
open files" error. With the same number of available file descriptors (on Linux, 
ulimit = unlimited), when he split the indices across two JVM instances, the problem 
disappeared. He also tested by increasing the memory available to the JVM instance, 
via the -Xmx parameter, with all indices running in one JVM instance, and again the 
problem disappeared. I think the issue deserves more testing to pinpoint the exact 
cause, but I was wondering whether anyone has experienced anything similar, or 
whether this information could be of use to anyone, in which case we should 
probably start a new thread dedicated to it.
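In case it helps anyone reproduce this, the per-process descriptor limit and the 
descriptors a given JVM actually holds open can be checked like so (Linux; the PID 
is hypothetical -- substitute your JVM's):

```shell
# Show the current per-process open-file limit (what ulimit governs)
ulimit -n

# Count file descriptors currently held open by a process, e.g. a JVM
# (replace 12345 with the actual PID)
# ls /proc/12345/fd | wc -l
```

Comparing that count against the limit while the indexes are open should show 
whether the JVM really is hitting the ceiling or whether something else is going on.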


Regards,
Rasik


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
