Have a 300K URL segment to fetch (no parsing).
I see memory continuously growing... it looks like a memory leak.
I have patches 769 and 770 installed, and did not see any other patches related
to memory leaks.
--
-MilleBii-
Does it run out of memory? Is GC able to reclaim consumed heap space?
> Have a 300K URL segment to fetch (no parsing).
> I see memory continuously growing... it looks like a memory leak.
> I have patches 769 and 770 installed, and did not see any other patches
> related to memory leaks.
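One way to answer the question about whether GC can reclaim heap is to watch the live JVM. This is a hypothetical sketch using the standard HotSpot tools `jps` and `jstat` (it assumes a HotSpot JDK is installed on the crawler host; the PID placeholder is illustrative):

```shell
# Find the fetcher's JVM process ID (lists running Java processes with their main class).
jps -lm

# Sample heap occupancy and GC activity every 5 seconds for that PID.
# If the old-generation utilization (OU/O column) keeps climbing even after
# full GCs (FGC column increments), the heap is genuinely not reclaimable
# and a leak is likely.
jstat -gcutil <pid> 5000
```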
How many threads do you have running concurrently?
Is there any log output indicating warnings or errors otherwise?
On Sat, Jul 2, 2011 at 7:40 AM, Markus Jelsma markus.jel...@openindex.io wrote:
> Does it run out of memory? Is GC able to reclaim consumed heap space?
>
> > Have a 300K URLs
Here's a snippet from our wiki; these settings dump GC logs to a file which you
can inspect.

-verbose:gc
    enables verbose logging of the garbage collector
-XX:+PrintGCTimeStamps
    enables printing of timestamps in the garbage collector logs
-XX:+PrintGCDetails
    enables printing of detailed information about each garbage collection event
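To actually get the log into a file rather than on stdout, the flags above are typically combined with `-Xloggc`. A minimal sketch of wiring this into a Hadoop-launched fetcher, assuming the standard `HADOOP_OPTS` mechanism and a hypothetical log path:

```shell
# Hypothetical configuration: pass GC-logging flags to the child JVM.
# -Xloggc:<file> redirects the GC log to the named file (path is an example).
export HADOOP_OPTS="$HADOOP_OPTS -verbose:gc -XX:+PrintGCTimeStamps \
  -XX:+PrintGCDetails -Xloggc:/var/log/nutch-gc.log"
```

After a run, inspecting the log shows heap sizes before and after each collection; if post-GC heap keeps rising across full GCs, that points at a genuine leak rather than GC lag.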