Problem is, there is no way to force a gc. Runtime.gc() only requests
that a gc be performed - if the JVM is busy you may not get a GC at all.
On Jul 3, 2006, at 12:50 PM, Chuck Williams wrote:
I'd suggest forcing gc after every n iterations of your loop to
eliminate the garbage factor. Also, you can run a profiler to see which
objects are leaking (e.g., the NetBeans profiler is excellent). Those
steps should identify any issues quickly.
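A minimal sketch of the suggestion above, in plain Java: request a collection every N iterations so that ordinary not-yet-collected garbage is not mistaken for a leak. The names GcEveryN, runCycles, and doSearchCycle are placeholders for illustration, not Lucene API, and System.gc() is only a hint the JVM is free to ignore.

```java
public class GcEveryN {
    static final int N = 100; // request a collection every 100 iterations

    static int runCycles(int total) {
        for (int i = 1; i <= total; i++) {
            doSearchCycle();
            if (i % N == 0) {
                System.gc(); // only a hint: the JVM may ignore it entirely
            }
        }
        return total;
    }

    // Placeholder for the real open-search-close cycle under test.
    static void doSearchCycle() {
        byte[] perIterationGarbage = new byte[64 * 1024];
    }

    public static void main(String[] args) {
        int done = runCycles(1000);
        long used = Runtime.getRuntime().totalMemory()
                  - Runtime.getRuntime().freeMemory();
        System.out.println(done + " cycles done, heap roughly steady at ~"
                + used / (1024 * 1024) + " MB");
    }
}
```

If heap use still climbs steadily across iterations even with the periodic gc requests, that points at retained references rather than pending garbage.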
Chuck
robert engels wrote on 07/03/2006 07:40 AM:
Did you try what was suggested (-Xmx16m), and did you get an OOM? If
not, there is no memory leak.
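Concretely, the test looks something like the following; the class name YourSearchTest and the jar name are placeholders for whatever driver and Lucene version you are running.

```shell
# Cap the heap at 16 MB; with a real leak the reopen loop will hit
# java.lang.OutOfMemoryError within a few cycles instead of growing forever.
java -Xmx16m -cp .:lucene-core.jar YourSearchTest
```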
On Jul 3, 2006, at 12:33 PM, Bruno Vieira wrote:
Thanks for the answer, but I have isolated the cycle inside a loop in a
static void main(String[] args) class to test this issue. In this case
there were no classes referencing the IndexSearcher and the problem
still happened.
2006/7/3, robert engels <[EMAIL PROTECTED]>:
You may not have a memory leak at all. It could just be garbage
waiting to be collected. I am fairly certain there are no "memory
leaks" in the current Lucene code base (outside of the ThreadLocal
issue).
A simple way to verify this would be to add -Xmx16m on the command
line. If there were a memory leak then it would eventually fail with
an OOM.
If there is a memory leak, then it is probably because your code is
holding on to IndexReader references in some static var or map.
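As an illustration of that retention pattern (plain Java, not Lucene code): a static cache that keeps adding entries for each reopened searcher and never evicts the old ones keeps every generation strongly reachable, so closing the searcher alone cannot free its memory. All names here are hypothetical.

```java
import java.util.HashMap;
import java.util.Map;

public class StaticCacheLeak {
    // Static cache: nothing is ever removed, so every entry stays reachable.
    static final Map<Integer, Object> SEARCHERS = new HashMap<Integer, Object>();

    static Object reopen(int generation) {
        Object searcher = new byte[1024]; // stand-in for an IndexSearcher/IndexReader
        SEARCHERS.put(generation, searcher); // old generations remain in the map
        return searcher;
    }

    public static void main(String[] args) {
        for (int gen = 0; gen < 100; gen++) {
            reopen(gen);
        }
        // All 100 generations are still strongly reachable from the static map,
        // so the collector can never reclaim them.
        System.out.println(SEARCHERS.size());
    }
}
```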
On Jul 3, 2006, at 9:43 AM, Bruno Vieira wrote:
Hi everyone,
I am working on a project with around 35000 documents (8 text fields,
each at most 256 chars) in Lucene. Unfortunately this index is updated
constantly, and I need these new items to appear in my search results
as fast as possible.
I have an IndexSearcher and I do a search, getting the last 10 results
ordered by a name field; the memory allocated is 13mb. I close the
IndexSearcher because the Lucene database was updated by an external
application, create a new IndexSearcher, and do the same search again,
getting the last 10 results ordered by the name field; now the memory
allocated is 15mb. Every time I repeat this cycle the memory grows by
2mb, so eventually I have a memory leak.
If the database is not updated and I do not create a new IndexSearcher,
I can do searches forever without a memory leak.
Why, when I close an IndexSearcher (indexSearcher.close();
indexSearcher = new IndexSearcher("/database/");) after some searches
with ordering and open a new one, is the memory not freed?
Thanks for any suggestions.
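The reopen cycle described above can be sketched in plain Java (FakeSearcher is a stand-in, not the Lucene class): hold exactly one reference to the current searcher, open the new one, close the old one, and let the old reference go out of scope so nothing keeps it alive.

```java
public class ReopenCycle {
    static class FakeSearcher implements AutoCloseable {
        boolean closed = false;
        public void close() { closed = true; }
    }

    static FakeSearcher current = new FakeSearcher();

    static void reopen() {
        FakeSearcher old = current;
        current = new FakeSearcher(); // open the replacement first
        old.close();                  // then close the old searcher
        // 'old' goes out of scope here; no static field or map retains it,
        // so the collector is free to reclaim it and anything it caches.
    }

    public static void main(String[] args) {
        FakeSearcher first = current;
        reopen();
        System.out.println(first.closed);   // the old searcher is closed
        System.out.println(current.closed); // the new one is still open
    }
}
```

If memory still grows with this shape of code, the retained data lives elsewhere (for example in a static cache keyed by the old reader), not in the searcher object itself.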
---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]