> 1) I was trying 3.2 because I thought the searching multiple dbs would
> make this easier.

Fair enough. We only recommend using 3.2 if you're willing to live with
the beta nature of the releases and need a particular feature in 3.2
(e.g. collection support, phrases, regex limits, etc.).

> 2) I read the man page of htdig about -m. The URL file should be URLs
> delimited by spaces, correct? Does htdig recursively spider these URLs?
> How do limit_urls_to and exclude_urls affect this?

No, with -m indexing is set to only one "hopcount," i.e. only those URLs
are indexed. No limits are applied to them--the limits are applied to the
links examined during indexing.
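
For illustration, here's roughly what that looks like (a sketch only--the
file name, URLs, and attribute values are placeholders, and the exact
file format for -m is spelled out in the man page):

    # urls.txt -- the list handed to -m (one URL per line here)
    http://www.example.com/docs/a.html
    http://www.example.com/docs/b.html

    # in the config file:
    limit_urls_to:  http://www.example.com/
    exclude_urls:   /cgi-bin/ .cgi

    htdig -v -c /path/to/htdig.conf -m urls.txt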

> 3) I wasn't trying to run them concurrently, just trying not to choke
> the system and let it deal with smaller chunks.

Sure. Just letting you know that if you do this, it'll take longer than
one "super long" run over the whole batch.

--
-Geoff Hutchison
Williams Students Online
http://wso.williams.edu/


