

Ok! It works for wordlist.work. My current htdig run just passed 5.5 GB without crashing. I didn't use a bad words list, so the file would grow as big as possible, as fast as possible. htdig hasn't finished yet; it has touched about 180,000 documents so far.

Our wordlist.work reached about 8 GB!



I'll see over the next few hours whether the compiler flags work for the db code as well.
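
Assuming the flags in question are the usual GNU large-file macros (-D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE), a quick C check (my sketch, assuming glibc on 32-bit Linux) shows what they actually change:

    /* lfs_check.c - compile once plain and once with
     *   cc -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE lfs_check.c
     * With the macros, sizeof(off_t) prints 8 on a 32-bit build;
     * without them it prints 4, which caps file offsets at 2 GB. */
    #include <stdio.h>
    #include <sys/types.h>

    int main(void)
    {
        printf("sizeof(off_t) = %u bytes\n", (unsigned)sizeof(off_t));
        return 0;
    }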



htmerge crashed as soon as words.db reached 4 GB. That seems to be a hard limit in the Berkeley DB that comes with htdig-3.1.6.
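
That 4 GB figure matches a 32-bit byte offset exactly: 2^32 bytes = 4 GiB, so any code path in the bundled db that still tracks file positions in an unsigned 32-bit integer wraps right there. A toy illustration (sketch only, not the actual db code):

    /* wrap.c - why 4 GB is the natural ceiling for 32-bit offsets:
     * an unsigned 32-bit counter wraps back to 0 at 2^32 bytes. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint32_t off = UINT32_MAX;  /* last addressable byte below 4 GiB */
        printf("max offset: %u (~%.2f GiB)\n",
               (unsigned)off, off / (1024.0 * 1024.0 * 1024.0));
        off += 1;                   /* one byte past 4 GiB: wraps to 0 */
        printf("one byte later: %u\n", (unsigned)off);
        return 0;
    }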


I got repeated messages like this:

DB2 problem...: Unable to allocate 1071 bytes from mpool shared region: Not enough memory available.


Perhaps I can get around this with a proper bad words list.
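
For anyone trying the same: htdig takes its stop words from the file named by the bad_word_list attribute in htdig.conf, one word per line. Something like this (the path and the words are just examples):

    # htdig.conf
    bad_word_list: ${common_dir}/bad_words

    # ${common_dir}/bad_words - one word per line
    the
    and
    of

Every word listed there is skipped during indexing, which keeps both wordlist.work and words.db smaller.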



Berthold


