Hi all, I am running a Python script that parses nearly 22,000 locally stored HTML files using BeautifulSoup. The problem is that memory usage grows linearly as the files are parsed. Once the script has parsed around 200 files, it consumes all available RAM and CPU usage drops to 0% (possibly due to excessive paging).
We tried 'del soup_object' and gc.collect(), but saw no improvement. Please guide me on how to limit Python's memory usage, or on the proper way to handle BeautifulSoup objects in a resource-efficient manner. -- Yours, S.Selvam
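
[For reference, a minimal sketch of one-file-at-a-time processing. It assumes bs4 and a hypothetical "html_files" directory; with the older BeautifulSoup 3 the import and constructor differ. The key point is soup.decompose(), which destroys the parse tree and breaks its internal parent/child references, so 'del soup' alone is not enough:]

    import gc
    import os
    from bs4 import BeautifulSoup  # BeautifulSoup 3: from BeautifulSoup import BeautifulSoup

    html_dir = "html_files"  # hypothetical path to the stored HTML files

    for name in os.listdir(html_dir):
        if not name.endswith(".html"):
            continue
        with open(os.path.join(html_dir, name), "rb") as f:
            soup = BeautifulSoup(f.read(), "html.parser")

        # ... extract whatever data you need from soup here ...

        # decompose() tears down the tree and severs the cyclic
        # parent/child references, making it immediately collectable;
        # 'del soup' only drops one reference and leaves the cycles
        # for the garbage collector to find later.
        soup.decompose()
        del soup

[One related caution: if you accumulate results by appending Tag or NavigableString objects to a list, each one keeps a reference to its parent tree and the whole parsed document stays alive. Convert them to plain strings first, e.g. str(tag.get_text()), before storing.]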