Re: yahoo crawlers hammering us

2010-09-08 Thread Matthew Petach
On Tue, Sep 7, 2010 at 1:19 PM, Ken Chase k...@sizone.org wrote: So I guess I'm new at internets, as my colleagues told me, because I haven't gone around to 30-40 systems I control (minus customer self-managed gear) and installed a restrictive robots.txt everywhere to make the web less useful to

Re: yahoo crawlers hammering us

2010-09-08 Thread Bruce Williams
I *am* curious--what makes it any worse for a search engine like Google to fetch the file than any other random user on the Internet? Possibly because that other user is the one the customer pays to have their content delivered to? Bruce Williams

RE: yahoo crawlers hammering us

2010-09-08 Thread Nathan Eisenberg
Possibly because that other user is the one the customer pays to have their content delivered to? Customers don't want to deliver their content to search engines? That seems silly. http://www.last.fm/robots.txt (Note the final 3 disallow lines...)
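As an aside, the way a given crawler will interpret a robots.txt policy can be checked programmatically rather than by eyeballing the file. A minimal sketch using Python's standard-library urllib.robotparser; the rules and URLs below are hypothetical and fed in as a string rather than fetched from a live site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: slow Yahoo's crawler (Slurp) and fence off one tree.
rules = """
User-agent: Slurp
Crawl-delay: 30
Disallow: /mirrors/

User-agent: *
Disallow: /private/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The Slurp entry disallows /mirrors/ but leaves the rest open.
print(rp.can_fetch("Slurp", "http://example.com/mirrors/big.iso"))  # False
print(rp.can_fetch("Slurp", "http://example.com/index.html"))       # True

# Crawl-delay is exposed too (non-standard directive, but Slurp honored it).
print(rp.crawl_delay("Slurp"))  # 30
```

Note that robots.txt is purely advisory: this tells you what a well-behaved crawler should do, not what any given client will actually do.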

Re: yahoo crawlers hammering us

2010-09-08 Thread Bruce Williams
Customers don't want to deliver their content to search engines?  That seems silly. Got me there! :-) Bruce Williams

Re: yahoo crawlers hammering us

2010-09-08 Thread Valdis Kletnieks
On Wed, 08 Sep 2010 02:21:31 PDT, Bruce Williams said: I *am* curious--what makes it any worse for a search engine like Google to fetch the file than any other random user on the Internet? Possibly because that other user is the one the customer pays to have their content delivered to? Seems to

Re: yahoo crawlers hammering us

2010-09-08 Thread Ken Chase
On Wed, Sep 08, 2010 at 12:04:07AM -0700, Matthew Petach said: I *am* curious--what makes it any worse for a search engine like Google to fetch the file than any other random user on the Internet? In either case, the machine doing the fetch isn't going to rate-limit the fetch, so you're
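Since the crawler itself won't rate-limit the fetch, one server-side complement (for operators who would rather not rely on the remote end behaving) is to throttle by User-Agent. A sketch using nginx's limit_req module, assuming the crawler identifies itself with "Slurp" in its UA string; zone name and rate are illustrative:

```nginx
# Map crawler UAs to a non-empty key; requests with an empty key
# are not counted against the limit, so normal visitors pass freely.
map $http_user_agent $crawler_key {
    default  "";
    ~*slurp  $binary_remote_addr;
}

# One request per second per crawling IP, tracked in a 10 MB zone.
limit_req_zone $crawler_key zone=crawlers:10m rate=1r/s;

server {
    location / {
        limit_req zone=crawlers burst=5;
    }
}
```

The same idea works in other servers (e.g. Apache with third-party throttling modules); the point is simply that the operator can enforce a ceiling regardless of what the crawler chooses to do.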

Re: yahoo crawlers hammering us

2010-09-08 Thread Matthew Petach
On Wed, Sep 8, 2010 at 9:20 AM, Ken Chase k...@sizone.org wrote: On Wed, Sep 08, 2010 at 12:04:07AM -0700, Matthew Petach said: I *am* curious--what makes it any worse for a search engine like Google to fetch the file than any other random user on the Internet? In either case, the

yahoo crawlers hammering us

2010-09-07 Thread Ken Chase
So I guess I'm new at internets, as my colleagues told me, because I haven't gone around to 30-40 systems I control (minus customer self-managed gear) and installed a restrictive robots.txt everywhere to make the web less useful to everyone. Does that really mean that a big outfit like yahoo should

Re: yahoo crawlers hammering us

2010-09-07 Thread Leslie
That speed doesn't seem too bad to me - robots.txt is our friend when one has bandwidth limitations. Leslie On 9/7/10 1:19 PM, Ken Chase wrote: So I guess I'm new at internets, as my colleagues told me, because I haven't gone around to 30-40 systems I control (minus customer self-managed gear)
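On the robots.txt suggestion: a policy along these lines does not have to be restrictive to be useful. Yahoo's crawler (Slurp) honored the non-standard Crawl-delay directive, so a minimal file could slow the crawler down without blocking it. The paths and delay below are hypothetical:

```
# Ask Yahoo's crawler for one fetch every 30 seconds.
# (Crawl-delay is a non-standard extension; Slurp honored it,
# but not every crawler does.)
User-agent: Slurp
Crawl-delay: 30

# Keep all crawlers out of the large mirror tree entirely.
User-agent: *
Disallow: /mirrors/
```

This addresses the "hammering" complaint directly while leaving the content indexable, which is usually what the customer actually wants.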

Re: yahoo crawlers hammering us

2010-09-07 Thread Harry Strongburg
On Tue, Sep 07, 2010 at 04:19:58PM -0400, Ken Chase wrote: This makes it look like Yahoo is actually trafficking in pirated software, but that's kinda too funny to expect to be true, unless some yahoo tech decided to use that IP/server @yahoo for his nefarious activity, but there are better