I was wondering whether anybody else who has been using updateDocument
has noticed that it uses a large amount of memory when updating existing documents.
For example, when I run updateDocument against an empty Lucene directory,
adding 12K documents creates a 3MB index and the program uses 270MB of
memory. When the program is executed again, this time updating all 12K
documents with exactly the same data, it consumes all of the memory
allocated to the JVM; the largest heap I tried was 1024MB. No
OutOfMemoryError occurred, but the memory was also not released after
closing all the readers and writers.
I was extremely surprised by this. I ran the program multiple times,
deleting the Lucene directory between runs so that the same code path
performs an add rather than an update, and I get very consistent
results: adding is very stable at 270MB, but updating existing
documents maxes out the JVM memory allocation.
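For reference, the update path in my test is essentially the loop below. The "id" field name and the MyRecord type are just placeholders for my own code, but the updateDocument call is exactly what I am using (Lucene 2.x-era field API):

```java
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.index.Term;

// Sketch of the update loop. "id" is a hypothetical unique-key field;
// updateDocument deletes any document matching the term, then adds the
// new document, as one atomic operation on the writer.
for (MyRecord rec : records) {
    Document doc = new Document();
    doc.add(new Field("id", rec.getId(),
            Field.Store.YES, Field.Index.UN_TOKENIZED));
    doc.add(new Field("body", rec.getBody(),
            Field.Store.NO, Field.Index.TOKENIZED));
    writer.updateDocument(new Term("id", rec.getId()), doc);
}
```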
Is there a configuration option that can be used to adjust the memory
usage?
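The only related knobs I have found so far are the writer's buffering settings; a sketch, assuming the Lucene 2.x IndexWriter API (please correct me if these are not the right ones, or if the values are unreasonable):

```java
// Flush buffered documents/deletes once they use more than 32MB of RAM
writer.setRAMBufferSizeMB(32.0);
// ...or flush once this many delete terms have been buffered,
// whichever comes first
writer.setMaxBufferedDeleteTerms(1000);
```

Neither setting seemed to change the behavior I described above, which is part of why I am asking here.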
I have also tried a separate deleteDocuments/addDocument sequence and get
similar results, so it appears to be more of a Lucene delete-document issue?
Any help would be greatly appreciated.
Thanks,
Kris
---------------------------------------------------------------------
To unsubscribe, e-mail: java-user-unsubscr...@lucene.apache.org
For additional commands, e-mail: java-user-h...@lucene.apache.org