On Fri, 2 Aug 2002, Nick Arnett wrote:
> Anyone here figured out what Yahoo will tolerate in terms of spidering its
> message header pages before it blocks the robot's IP address? Before I
> start testing, I figured I'd see if anyone else here has already done so.
> The duration of the block seems to lengthen, so testing could take a while.
> Sure would be nice if they'd just say what they consider acceptable...
This reminded me of the denial-of-service attacks that hit them (and
others) maybe a year and a half ago. If I recall correctly, I read that
their routers (or firewalls) were upgraded or reconfigured to block
excessive numbers of connections from a single source.
Maybe the spider can be slowed way down, so it behaves like a normal
human browsing.
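A slowed-down spider along those lines might look like the sketch below
(Python). The delay bounds and the injected fetch function are my own
assumptions for illustration; Yahoo has not published what it considers
acceptable.

```python
import random
import time

# Assumed bounds for a "human reading a page" pause, in seconds.
# These are guesses, not published limits.
MIN_DELAY = 15.0
MAX_DELAY = 60.0

def human_like_delay():
    """Return a randomized pause, so requests don't arrive on a fixed beat."""
    return random.uniform(MIN_DELAY, MAX_DELAY)

def crawl(urls, fetch, sleep=time.sleep):
    """Fetch each URL in turn, pausing a random interval between hits.

    `fetch` is whatever download routine you already use (urllib, etc.);
    it is passed in so the pacing loop itself stays testable.
    """
    pages = []
    for url in urls:
        pages.append(fetch(url))
        sleep(human_like_delay())
    return pages
```

Randomizing the interval matters as much as lengthening it: a request
every N seconds exactly is an easy signature for a rate-limiting router
to spot, while jittered gaps look more like a person reading.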
Jeremy C. Reed
echo 'G014AE824B0-07CC?/JJFFFI?D64CBD=3C427=;6HI2J' |
tr /-_ :\ Sc-y./ | sed swxw`uname`w
___
Robots mailing list
[EMAIL PROTECTED]
http://www.mccmedia.com/mailman/listinfo/robots