On Tue, Sep 7, 2010 at 1:19 PM, Ken Chase k...@sizone.org wrote:
> So i guess im new at internets as my colleagues told me because I havent gone
> around to 30-40 systems I control (minus customer self-managed gear) and
> installed a restrictive robots.txt everywhere to make the web less useful to
> everyone.

I *am* curious--what makes it any worse for a search engine like Google
to fetch the file than any other random user on the Internet?
Possibly because that other user is who the customer pays to have their
content delivered to?

Bruce Williams
> Possibly because that other user is who the customer pays to have their
> content delivered to?

Customers don't want to deliver their content to search engines? That seems
silly.

http://www.last.fm/robots.txt (Note the final 3 disallow lines...)
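For context, a restrictive robots.txt of the kind being debated in this thread might look like the following. This is a hypothetical sketch, not any customer's actual file; "Slurp" was Yahoo's crawler user-agent, and the path is made up:

```
# Ask Yahoo's crawler to wait between fetches instead of hammering the link
User-agent: Slurp
Crawl-delay: 10

# Keep all well-behaved crawlers out of a private area
User-agent: *
Disallow: /private/
```

Note that robots.txt is purely advisory: it only limits crawlers that choose to honor it, which is part of why it does nothing against "any other random user on the Internet."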
> Customers don't want to deliver their content to search engines? That seems
> silly.

Got me there! :-)

Bruce Williams
On Wed, 08 Sep 2010 02:21:31 PDT, Bruce Williams said:
>> I *am* curious--what makes it any worse for a search engine like Google
>> to fetch the file than any other random user on the Internet?
> Possibly because that other user is who the customer pays to have their
> content delivered to?

Seems to
On Wed, Sep 08, 2010 at 12:04:07AM -0700, Matthew Petach said:
> I *am* curious--what makes it any worse for a search engine like Google
> to fetch the file than any other random user on the Internet? In either
> case, the machine doing the fetch isn't going to rate-limit the fetch, so
> you're

On Wed, Sep 8, 2010 at 9:20 AM, Ken Chase k...@sizone.org wrote:
> On Wed, Sep 08, 2010 at 12:04:07AM -0700, Matthew Petach said:
>> I *am* curious--what makes it any worse for a search engine like Google
>> to fetch the file than any other random user on the Internet? In either
>> case, the
>
> So i guess im new at internets as my colleagues told me because I havent gone
> around to 30-40 systems I control (minus customer self-managed gear) and
> installed a restrictive robots.txt everywhere to make the web less useful to
> everyone.

Does that really mean that a big outfit like yahoo should
That speed doesn't seem too bad to me - robots.txt is our friend when
one has bandwidth limitations.

Leslie
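Leslie's point, that robots.txt is the tool for protecting a bandwidth-limited server from crawlers, can be sketched with Python's stdlib robotparser. A compliant bot checks each URL against the site's rules and honors any requested crawl delay; the rules and URLs below are hypothetical:

```python
# Minimal sketch of a well-behaved crawler consulting robots.txt before
# fetching. The rules here are made up for illustration, not taken from
# any real site. "Slurp" was Yahoo's crawler user-agent.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Crawl-delay: 10
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A compliant bot tests each URL against the rules before fetching it...
print(rp.can_fetch("Slurp", "http://example.com/private/file.iso"))   # False
print(rp.can_fetch("Slurp", "http://example.com/public/page.html"))   # True
# ...and sleeps this many seconds between requests, limiting load on
# the server's link.
print(rp.crawl_delay("Slurp"))                                        # 10
```

A crawler that ignores the file sees none of this, of course, which is the other side of the thread's argument.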
On 9/7/10 1:19 PM, Ken Chase wrote:
> So i guess im new at internets as my colleagues told me because I havent gone
> around to 30-40 systems I control (minus customer self-managed gear)
On Tue, Sep 07, 2010 at 04:19:58PM -0400, Ken Chase wrote:

This makes it look like Yahoo is actually trafficking in pirated software, but
that's kinda too funny to expect to be true, unless some yahoo tech decided to
use that IP/server @yahoo for his nefarious activity, but there are better