I've just rescued a looping 3.1.5 run that hit a server delivering an
indefinite tree of pages via a spurious link to "index.html/" - the
sysadmin added a temporary firewall block for the server concerned, thus
saving me a reindex of 200,000 documents!
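For anyone bitten by the same thing, a stopgap on the crawler side (rather
than a firewall block) might be to exclude the offending pattern with the
stock exclude_urls attribute - a sketch only, and the pattern list here is
illustrative, so check it against your own conf:

```
# htdig.conf - hypothetical stopgap for the "index.html/" loop
# exclude_urls is a list of substrings; any URL containing one is skipped
exclude_urls:    /cgi-bin/ .cgi index.html/
```

That only papers over this one loop, though - it doesn't give the general
safety net described below.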
So what? Oh yes - I do have a hop-count limit set, but this loop was not
going to hit it quickly, since it was looping on a fairly broad front. I
can't see an existing option for what I'm after, but I've looked and
failed to find things before, so excuse me if this already exists:
It would seem useful, as a safety-net limit, to have a MaxDocID
configuration option - I know I expect to index 220,000 documents
+/- 20,000, so I could usefully set it to 300,000.
I'm not sure whether there is also merit in suggesting a MaxParsedDocs
option to limit the first number in the -v output (MaxDocID would cap the
second number in the -v output). I guess a cap on either one would provide
the safety net I'm looking for.
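If such options were added, I imagine they would look something like this
in the conf file - names and syntax entirely hypothetical, since these
attributes don't exist yet (that's the request):

```
# proposed safety-net attributes (not implemented - illustration only)
max_doc_id:      300000    # abort the dig once document IDs pass this
max_parsed_docs: 300000    # abort once this many documents have been parsed
```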
regards,
Malcolm.
[EMAIL PROTECTED] http://users.ox.ac.uk/~malcolm/
_______________________________________________
htdig-general mailing list <[EMAIL PROTECTED]>
To unsubscribe, send a message to <[EMAIL PROTECTED]> with a
subject of unsubscribe
FAQ: http://htdig.sourceforge.net/FAQ.html