Ok, this question might be quite basic ... but I couldn't find anything in
the FAQ that addressed it specifically.
I'm in the process of setting up a web archive of a number of mailing lists
I run, with htdig as the search engine.
When I ran 'rundig' against the archive to create the initial database, it
took quite a long time (~6 hours on a P2 300 with 64MB of RAM).
When I update the archive and re-run the 'dig', it also seems to take a
long time.
I am running it with the '-a' flag, so it doesn't blow away the current
files ... but I was wondering if there is a way I can make the database
file creation run faster?
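For reference, what I'm running is roughly the following (simplified -- I'm
leaving out my local config path):

    # initial database build -- this is the run that took ~6 hours
    rundig

    # update runs, using alternate work files so the current
    # database files don't get blown away while the dig is going
    rundig -a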
Would more memory affect the processing significantly? Maybe more
disk? (I've got 6GB now.) Is there a mode of operation where it only looks
at changed pages for indexing?
Thanks!
david
--
| Internet: [EMAIL PROTECTED]
| WWW: http://www.midrange.com/david
|
| This message was written and delivered using 100%
| post-consumer (recycled) data bits.