Hi,

please take a look at the attached SNMP diagrams.
At 16:25 I started fetching.
At 17:55 I created a new segment and restarted the fetcher with many more links.


Do you think it is normal that after ca. 70,000 pages Nutch uses > 1 GB of memory for the fetcher?
What kind of information does the fetcher collect over time?
Would it make sense to write this data to disk, since we don't have an I/O performance problem during fetching?
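To see whether the growth is really JVM heap (rather than OS-level caching on the SNMP graphs), something like the following could be logged periodically while the fetcher runs. This is only a minimal sketch using the standard `java.lang.Runtime` API, not Nutch code; the class and method names are made up for illustration:

```java
// HeapProbe: hypothetical helper to log JVM heap usage during a fetch run.
public class HeapProbe {

    // Returns a one-line summary of used vs. maximum heap, in megabytes.
    public static String heapSummary() {
        Runtime rt = Runtime.getRuntime();
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long maxMb = rt.maxMemory() / (1024 * 1024);
        return "heap used: " + usedMb + " MB / max: " + maxMb + " MB";
    }

    public static void main(String[] args) {
        // In a real fetch run this would be called on a timer or per N pages.
        System.out.println(heapSummary());
    }
}
```

If the used-heap line keeps climbing with page count, the fetcher is retaining per-page state in memory; if it stays flat while the SNMP "real memory" graph grows, the growth is likely elsewhere (OS cache, other processes).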

<<inline: real_mem-day.png>>


<<inline: swap_mem-day.png>>


---------------------------------------------------------------
enterprise information technology consulting
open technology: http://www.media-style.com
open source: http://www.weta-group.net
open discussion: http://www.text-mining.org
