> Would an appropriate /robots.txt help things out?
>

Doesn't look like it. The "guilty" hosts never attempted to download 
robots.txt at all. Well-behaved bots like Google's do request that file 
and respect it, but those aren't the ones causing issues or downloading 
duplicate files. Nor do they show up as "wget/x.yy" user agents in the 
Apache logs.
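
FWIW, here's a rough sketch of how one could cross-check that in the 
access logs, i.e. which hosts fetched robots.txt versus which ones 
identify as wget. The log path and the combined log format are 
assumptions; adjust for the actual setup:

  # Sketch: scan a combined-format Apache access log and report
  # hosts that requested robots.txt and hosts claiming a wget agent.
  import re

  LOG = "/var/log/apache2/access.log"  # assumed path, adjust as needed

  robots_hosts = set()
  wget_hosts = set()

  with open(LOG) as f:
      for line in f:
          # host ident user [date] "METHOD path ..." status size "referer" "agent"
          m = re.match(
              r'(\S+) \S+ \S+ \[[^\]]+\] '
              r'"(\S+) (\S+)[^"]*" \S+ \S+ "[^"]*" "([^"]*)"',
              line)
          if not m:
              continue
          host, method, path, agent = m.groups()
          if path == "/robots.txt":
              robots_hosts.add(host)
          if agent.lower().startswith("wget"):
              wget_hosts.add(host)

  print("Requested robots.txt:", sorted(robots_hosts))
  print("Wget user agents:   ", sorted(wget_hosts))

In our case the problem hosts show up in neither set, which is the 
point: they never asked for robots.txt and don't advertise wget.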

Gerard