Exactly: Java will only "free" memory when it needs to. If you set a maximum
heap, then under most "heavy load" conditions you will reach that maximum
before Java attempts to free anything. This is done for performance reasons.

There are options for the garbage collector that control how often it
attempts to run, how aggressive it is, and whether it gives memory back to
the OS.

You should review the Sun Java documentation on the garbage collector options.
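
For example, on the Sun JVM something along these lines controls the initial
and maximum heap size and how eagerly the heap is shrunk back (a sketch only;
check the GC documentation for your JVM version, and YourApplication is just
a placeholder):

  java -Xms32m -Xmx512m \
       -XX:MinHeapFreeRatio=10 -XX:MaxHeapFreeRatio=30 \
       -verbose:gc YourApplication

-verbose:gc just logs each collection, which makes it easy to see when (and
how much) memory is actually being reclaimed.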


On Jul 3, 2006, at 1:08 PM, Bruno Vieira wrote:

No, I really did not get an OOM.

So the impression I have is that there is some kind of static cache that uses
the free memory for the IndexSearcher: if I set -Xmx16m the application uses
the entire 16m, and if I set -Xmx512m, after some time the application uses
the entire 512m, such that even if I instantiate a new IndexSearcher this
cache is not cleared.

Am I right? If so, is there some parameter or method with which I can free or
control that?
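
To make it concrete, this is roughly the cycle I am running (a simplified
sketch; the path, field name, query, and loop count are just placeholders):

  import org.apache.lucene.index.Term;
  import org.apache.lucene.search.Hits;
  import org.apache.lucene.search.IndexSearcher;
  import org.apache.lucene.search.Query;
  import org.apache.lucene.search.Sort;
  import org.apache.lucene.search.TermQuery;

  public class ReopenTest {
      public static void main(String[] args) throws Exception {
          IndexSearcher searcher = new IndexSearcher("/database/");
          Query query = new TermQuery(new Term("name", "foo"));
          Sort sort = new Sort("name");                 // ordering by the name field
          for (int i = 0; i < 10000; i++) {
              Hits hits = searcher.search(query, sort); // sorted search
              System.out.println(hits.length());
              // the index was updated by an external application,
              // so close the old searcher and open a new one
              searcher.close();
              searcher = new IndexSearcher("/database/");
          }
      }
  }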

Thanks

2006/7/3, robert engels <[EMAIL PROTECTED]>:

Did you try what was suggested (-Xmx16m), and did you get an OOM? If
not, there is no memory leak.

On Jul 3, 2006, at 12:33 PM, Bruno Vieira wrote:

> Thanks for the answer, but I have isolated the cycle inside a loop in a
> static void main(String[] args) test class to check this issue. In this
> case there were no other classes referencing the IndexSearcher and the
> problem still happened.
>
>
> 2006/7/3, robert engels <[EMAIL PROTECTED]>:
>>
>> You may not have a memory leak at all. It could just be garbage
>> waiting to be collected. I am fairly certain there are no "memory
>> leaks" in the current Lucene code base (outside of the ThreadLocal
>> issue).
>>
>> A simple way to verify this would be to add -Xmx16m on the command
>> line. If there is a memory leak, it will eventually fail with an OOM.
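>>
>> For example (the class name and classpath below are just placeholders):
>>
>>   java -Xmx16m -cp yourclasspath YourTestClass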
>>
>> If there is a memory leak, then it is probably because your code is
>> holding on to IndexReader references in some static var or map.
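>>
>> For example, a pattern like the following (purely hypothetical, not
>> taken from your code) would keep every searcher strongly reachable and
>> defeat garbage collection:
>>
>>   // hypothetical: a static collection that is never cleared
>>   private static final java.util.List searchers = new java.util.ArrayList();
>>   ...
>>   searchers.add(indexSearcher);  // old searchers can never be collected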
>>
>>
>> On Jul 3, 2006, at 9:43 AM, Bruno Vieira wrote:
>>
>> > Hi everyone,
>> >
>> > I am working on a project with around 35,000 documents (8 text fields,
>> > each at most 256 chars) in Lucene. Unfortunately this index is updated
>> > constantly, and I need these new items to appear in my search results
>> > as fast as possible.
>> >
>> > I have an IndexSearcher and I do a search getting the last 10 results,
>> > ordered by a name field; the memory allocated is 13mb. I close the
>> > IndexSearcher because the Lucene index was updated by an external
>> > application, create a new IndexSearcher, and do the same search again,
>> > getting the last 10 results ordered by the name field; now the memory
>> > allocated is 15mb. Every time I run this cycle the memory increases by
>> > 2mb, so at some point I have a memory leak.
>> >
>> > If the index is not updated and I do not create a new IndexSearcher,
>> > I can do searches forever without a memory leak.
>> >
>> > Why, when I close an IndexSearcher after some searches with ordering
>> > and open a new one (indexSearcher.close(); indexSearcher =
>> > new IndexSearcher("/database/");), is the memory not freed?
>> >
>> > Thanks for any suggestions.
>>
>>