There are usually only a couple of sort fields but a lot of terms in the
various indices.  The terms are user-entered on various media, so the
number of unique terms is very large.
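
For reference, a sorted query against the shared MultiSearcher looks
roughly like the sketch below (simplified, not the production code; the
field names "contents" and "title" are made up):

import org.apache.lucene.index.Term;
import org.apache.lucene.search.Searcher;
import org.apache.lucene.search.Sort;
import org.apache.lucene.search.SortField;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TopFieldDocs;

public class SortedSearchSketch {
    // Run one user-entered term as a query, sorted on a string field.
    // In 2.4, sorting on a string field loads a FieldCache entry per
    // document for each index reader underneath the searcher.
    public static TopFieldDocs search(Searcher searcher, String userTerm)
            throws Exception {
        TermQuery query = new TermQuery(new Term("contents", userTerm));
        Sort sort = new Sort(new SortField("title", SortField.STRING));
        return searcher.search(query, null, 20, sort);
    }
}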

Thanks for the help.

Todd



On 10/29/08, Todd Benge <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I'm the lead engineer for search on a large website that uses Lucene.
>
> We're indexing about 300M documents in ~100 indices.  The indices add
> up to ~60G.
>
> The indices are split across 4 different MultiSearchers, with the largest
> handling ~50G.
>
> The code is basically like the following:
>
> // One shared MultiSearcher over all of the on-disk indices.
> private static MultiSearcher searcher;
>
> public void init(File[] files) throws IOException {
>      IndexSearcher[] searchers = new IndexSearcher[files.length];
>      int i = 0;
>      for (File file : files) {
>           searchers[i++] = new IndexSearcher(FSDirectory.getDirectory(file));
>      }
>      searcher = new MultiSearcher(searchers);
> }
>
> public Searcher getSearcher() {
>      return searcher;
> }
>
> We're seeing heavy caching of Term & TermInfo objects in Lucene 2.4.
> Performance is good, but the servers are consistently hanging with
> OutOfMemory errors.
>
> We're allocating a 4G heap to each server.
>
> Is there any way to control the amount of memory Lucene consumes for
> caching?  Any other suggestions for fixing the memory errors?
>
> Thanks,
>
> Todd
>

-- 
Sent from Gmail for mobile | mobile.google.com
