We currently use ht://Dig. As part of the nightly process that builds the search-engine database, ht://Dig also produces a report of broken links on our internal and external websites. The biggest
flaw is that the report sometimes fails to indicate which page contains the
broken link, which makes those entries useless.
Here is how it reports a broken link when it is working correctly:
Not found: http://www.mbari.org/data/mapping/loihi.htm
Ref: http://www.mbari.org/data/mapping/hawaii/references.htm


And here is what we get when it isn't working (the "Ref" field is empty,
so we don't know which page contains the broken link):
Not found: http://www.mbari.org/rd/search.htm
Ref:
I'm looking for an automated solution, one that does not require me to scan
our entire website by hand every day.
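As a partial workaround while waiting for a fix, the report format shown above is easy to parse, so the entries with an empty "Ref:" field can at least be separated out and grouped. This is a minimal sketch, assuming the two-line "Not found:" / "Ref:" format shown in the examples; the report text is inlined here for illustration, but in practice it would be read from the file ht://Dig writes each night:

```python
# Minimal sketch: parse an ht://Dig broken-link report (two-line
# "Not found:" / "Ref:" entries, as in the examples above), group
# broken URLs by referring page, and collect entries with an empty
# "Ref:" field separately so they can be hunted down afterwards
# (e.g. by grepping the site's HTML for the broken URL).
from collections import defaultdict

def parse_report(lines):
    by_ref = defaultdict(list)   # referring page -> list of broken URLs
    orphans = []                 # broken URLs with no referrer recorded
    current = None
    for line in lines:
        line = line.strip()
        if line.startswith("Not found:"):
            current = line[len("Not found:"):].strip()
        elif line.startswith("Ref:") and current is not None:
            ref = line[len("Ref:"):].strip()
            if ref:
                by_ref[ref].append(current)
            else:
                orphans.append(current)
            current = None
    return dict(by_ref), orphans

# Sample input taken from the two report excerpts above.
report = """\
Not found: http://www.mbari.org/data/mapping/loihi.htm
Ref: http://www.mbari.org/data/mapping/hawaii/references.htm
Not found: http://www.mbari.org/rd/search.htm
Ref:
"""
by_ref, orphans = parse_report(report.splitlines())
```

With the sample data, `by_ref` maps the references.htm page to its one broken link, and `orphans` holds the search.htm URL whose referrer is missing; those orphans are the ones that still need a manual (or grep-based) search.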






_______________________________________________
ht://Dig general mailing list: <[EMAIL PROTECTED]>
ht://Dig FAQ: http://htdig.sourceforge.net/FAQ.html
List information (subscribe/unsubscribe, etc.)
https://lists.sourceforge.net/lists/listinfo/htdig-general
