Hi guys,

Please rip this to shreds: https://github.com/kaihendry/linkrot and
perhaps guide me to a better script, ideally one that does the HTTP
requests in parallel and is hence much faster?

I ran it over sites/ with:

for i in *; do test -d "$i" || continue; linkrot "$i" > "$i.linkrot"; done

(variables quoted, since directory names may contain spaces)
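On the parallelism question, one option that stays in shell is to fan the requests out with xargs. A minimal sketch, assuming GNU xargs (for -P) and curl; it reads URLs one per line on stdin and prints "HTTP-code URL", where curl reports 000 when the request fails outright (DNS failure, timeout, etc.), matching the 000 convention below:

```shell
# Sketch only: check URLs in parallel, 8 at a time.
# Assumes GNU xargs (-P) and curl are available.
check_urls() {
    xargs -P 8 -I{} sh -c \
        'printf "%s %s\n" "$(curl -o /dev/null -s -w "%{http_code}" --max-time 10 "$1")" "$1"' sh {}
}

# Example: an unresolvable RFC 2606 test domain comes back as 000.
printf '%s\n' "http://nonexistent.invalid/" | check_urls
# → 000 http://nonexistent.invalid/
```

The URL is passed to the inner sh as a positional argument rather than substituted into the script text, which avoids quoting surprises with odd URLs.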

and the output is over here:
http://s.natalian.org/2013-01-13/

000 means the domain didn't resolve. There are definitely some false
positives, e.g. on cat-v: sites are sometimes just temporarily down, so
I guess the failures need to be counted/recorded, and only when a
threshold is hit (e.g. 10 consecutive failures over 10 daily checks)
should an admin need to manually intervene?
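The threshold idea above could be sketched like this, assuming a hypothetical state file (fails.db, "count URL" per line) carried between daily runs; record_failure bumps a URL's consecutive-failure count and emits an alert at 10, record_success resets it:

```shell
# Sketch only: track consecutive failures per URL in a flat file
# (hypothetical format: "count URL", one entry per line).
record_failure() {
    url=$1 db=${2:-fails.db}
    touch "$db"
    count=$(awk -v u="$url" '$2 == u {print $1}' "$db")
    count=$(( ${count:-0} + 1 ))
    # Rewrite the file with this URL's count bumped.
    { awk -v u="$url" '$2 != u' "$db"; printf '%s %s\n' "$count" "$url"; } > "$db.tmp" &&
        mv "$db.tmp" "$db"
    # Only flag for manual intervention after 10 consecutive failures.
    if [ "$count" -ge 10 ]; then
        echo "ALERT: $url failed $count times"
    fi
}

record_success() {
    url=$1 db=${2:-fails.db}
    touch "$db"
    # A success resets the streak: drop the URL's entry entirely.
    awk -v u="$url" '$2 != u' "$db" > "$db.tmp" && mv "$db.tmp" "$db"
}
```

A real version would want locking if checks run concurrently, but for a once-a-day cron job a flat file like this is probably enough.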

Kind regards,
