Maybe it's not a leak, Monique. :) If you sort on a field in Lucene, for example, the FieldCache will hold that field's values in memory for as long as the IndexReader is open.
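For illustration, here is a minimal sketch of how a single sorted search fills the FieldCache (this assumes a Lucene 2.9/3.0-style API; the index path and the "title"/"body" field names are made up):

import java.io.File;

import org.apache.lucene.index.IndexReader;
import org.apache.lucene.index.Term;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Sort;
import org.apache.lucene.search.SortField;
import org.apache.lucene.search.TermQuery;
import org.apache.lucene.search.TopDocs;
import org.apache.lucene.store.FSDirectory;

public class SortedSearch {
  public static void main(String[] args) throws Exception {
    // Open the index read-only; FieldCache entries are keyed per reader.
    IndexReader reader = IndexReader.open(
        FSDirectory.open(new File("/path/to/index")), true);
    IndexSearcher searcher = new IndexSearcher(reader);

    // Sorting by "title" makes Lucene un-invert that field the first time
    // this runs: one value per document is loaded into the FieldCache and
    // stays in RAM for as long as the reader is open. On a 32GB index that
    // alone can account for hundreds of MB -- it's a cache, not a leak.
    Sort sort = new Sort(new SortField("title", SortField.STRING));
    TopDocs hits = searcher.search(
        new TermQuery(new Term("body", "lucene")), null, 10, sort);
    System.out.println("hits: " + hits.totalHits);

    searcher.close();
    // Once the reader is closed and unreferenced, its FieldCache entries
    // become eligible for garbage collection.
    reader.close();
  }
}

So if you only sort by relevance (the default), the FieldCache stays empty and the heap footprint should be much flatter.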
Otis
----
Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
Hadoop ecosystem search :: http://search-hadoop.com/

----- Original Message ----
> From: Monique Monteiro <monique.lou...@gmail.com>
> To: java-user@lucene.apache.org
> Sent: Fri, March 5, 2010 1:38:31 PM
> Subject: OutOfMemoryError
>
> Hi all,
>
> I’m new to Lucene and I’m evaluating it in a web application which looks
> up strings in a huge index – the index file is 32GB. I keep a
> reference to a Searcher object for the application’s lifetime, but this
> object has heavy memory requirements and keeps memory consumption around
> 950MB. I did some optimization to share some fields between two
> “composed” indices, but in a web application with less than 1GB for the
> JVM, an OutOfMemoryError is thrown. It seems that the searcher keeps some
> form of cache which is not frequently released.
>
> I’d like to know if this kind of memory leak is normal according to
> Lucene’s behaviour, and if the only available solution is adding memory
> to the JVM.
>
> Thanks in advance!
>
> --
> Monique Monteiro, MSc
> IBM OOAD / SCJP / MCTS Web
> Blog: http://moniquelouise.spaces.live.com/
> Twitter: http://twitter.com/monilouise
> MSN: monique_lou...@msn.com
> GTalk: monique.lou...@gmail.com