Hi,

does Lucene keep the complete index in memory? As stated before, the resulting index is 50MB, which would match the memory footprint Lucene requires as seen in my app:

jvm 120MB - 50MB (Lucene) - 50MB (my app) = something left
jvm 100MB - 50MB (Lucene) - 50MB (my app) = OutOfMemoryError

Still, some hint from the programming side as to whether this is actually the case would be appreciated ...

Stefan
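(Aside: in 2.4.x the writer does not hold the finished index in RAM; it buffers added documents and flushes them once a configurable buffer limit is reached, though merges can temporarily need more. A minimal sketch of capping that buffer, assuming the Lucene 2.4.x API; the index path and analyzer are placeholders, not from this thread:)

  import java.io.File;

  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.store.FSDirectory;

  public class BoundedWriter {
      public static void main(String[] args) throws Exception {
          // Hypothetical index location, for illustration only.
          FSDirectory dir = FSDirectory.getDirectory(new File("/tmp/test-index"));
          IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(),
                  true, IndexWriter.MaxFieldLength.LIMITED);
          // Flush buffered documents once they exceed this many MB of RAM.
          // 16MB is the 2.4 default; lower it if the heap is tight.
          writer.setRAMBufferSizeMB(16.0);
          // ... addDocument() calls go here ...
          writer.close();
          dir.close();
      }
  }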
-----Original Message-----
From: Sudarsan, Sithu D. [mailto:sithu.sudar...@fda.hhs.gov]
Sent: Wednesday, June 24, 2009 4:18 PM
To: java-user@lucene.apache.org
Subject: RE: OutOfMemoryError using IndexWriter

It happens when the segments are merged but not yet optimized. It happened to our program at 1.8GB. We develop and test on Win32 but run the code on Linux, which seems to handle indexes of at least up to 3GB. In fact, if the index size is beyond 1.8GB, even Luke throws a Java heap error if I try to open it.

Please post your results/views.

Sincerely,
Sithu

-----Original Message-----
From: stefan [mailto:ste...@intermediate.de]
Sent: Wednesday, June 24, 2009 10:08 AM
To: java-user@lucene.apache.org
Subject: RE: OutOfMemoryError using IndexWriter

Hi,

I do use Win32. What do you mean by "the index file before optimizations crosses your jvm memory usage settings (if say 512MB)"? Could you please explain this further?

Stefan

-----Original Message-----
From: Sudarsan, Sithu D. [mailto:sithu.sudar...@fda.hhs.gov]
Sent: Wednesday, June 24, 2009 3:55 PM
To: java-user@lucene.apache.org
Subject: RE: OutOfMemoryError using IndexWriter

Hi Stefan,

Are you using 32-bit Windows? If so, and the index file before optimization crosses your JVM memory settings (say 512MB), there is a possibility of this happening. Increase the JVM memory settings if that is the case.

Sincerely,
Sithu D Sudarsan
Off: 301-796-2587
sithu.sudar...@fda.hhs.gov
sdsudar...@ualr.edu

-----Original Message-----
From: stefan [mailto:ste...@intermediate.de]
Sent: Wednesday, June 24, 2009 4:09 AM
To: java-user@lucene.apache.org
Subject: OutOfMemoryError using IndexWriter

Hi,

I am using Lucene 2.4.1 to index a database with less than a million records. The resulting index is about 50MB in size.

I keep getting an OutOfMemoryError if I re-use the same IndexWriter to index the complete database, even though re-use is recommended in the performance hints. What I do now is close the index every 10000 objects (and optimize it on every 50th close) and create a new IndexWriter to continue. This works fine, but it hardly seems the recommended way to go.

I've been using jhat/jmap as well as the NetBeans profiler and am fairly sure that this is a problem related to Lucene.

Any ideas - or should I post this to Jira? Jira has quite a few OutOfMemory issues, but they all seem to be closed as of version 2.4.1.

Thanks,
Stefan
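(For concreteness, the batching workaround described above might look roughly like this; a sketch only, assuming the Lucene 2.4.x API, with the index path, analyzer, and document source as placeholders:)

  import java.io.File;

  import org.apache.lucene.analysis.standard.StandardAnalyzer;
  import org.apache.lucene.document.Document;
  import org.apache.lucene.index.IndexWriter;
  import org.apache.lucene.store.FSDirectory;

  public class BatchedIndexer {
      public static void index(Iterable<Document> docs) throws Exception {
          FSDirectory dir = FSDirectory.getDirectory(new File("/tmp/test-index"));
          IndexWriter writer = new IndexWriter(dir, new StandardAnalyzer(),
                  true, IndexWriter.MaxFieldLength.LIMITED);
          int docCount = 0;
          int closeCount = 0;
          for (Document doc : docs) {
              writer.addDocument(doc);
              // Every 10000 documents: close the writer and open a new one.
              if (++docCount % 10000 == 0) {
                  // On every 50th close, optimize the index first.
                  if (++closeCount % 50 == 0) {
                      writer.optimize();
                  }
                  writer.close();
                  writer = new IndexWriter(dir, new StandardAnalyzer(),
                          false, IndexWriter.MaxFieldLength.LIMITED);
              }
          }
          writer.close();
          dir.close();
      }
  }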
---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
For additional commands, e-mail: java-user-h...@lucene.apache.org