At 18:19 -0500 1/4/11, Mark Montague wrote:
>Follow the example below, but use only the user agent condition, omit the IP 
>condition, and suitably adjust the RewriteRule regular expression to match the 
>URL(s) you wish to block:
>
>http://httpd.apache.org/docs/2.2/rewrite/rewrite_guide.html#blocking-of-robots
>
>Note that wget has a -U option that can be used to get around this block by 
>using a user agent string that you are not blocking -- so the block will not 
>prevent a determined downloader.

*******
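For what it's worth, the adaptation Mark describes comes out to something like this (a minimal sketch, written for server or virtual host context -- drop the leading slash in a .htaccess file; the "Wget" user agent string and the /downloads/ path are placeholders for illustration, so substitute whatever you actually need to block):

    RewriteEngine On
    # Match on the user agent alone; the guide's IP-address condition is omitted.
    RewriteCond %{HTTP_USER_AGENT} ^Wget
    # Return 403 Forbidden for requests to the protected URL(s).
    RewriteRule ^/downloads/ - [F]

And as Mark says, this only stops polite clients -- e.g. "wget -U Mozilla http://example.com/downloads/file" sails right past it.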

You might want to have a look at this rather new mailing list. Its members are interested in doing exactly the opposite of what you want.

List-Id: webscrapers talk <webscrapers.cool.haxx.se>
List-Archive: <http://cool.haxx.se/pipermail/webscrapers>
List-Post: <mailto:webscrap...@cool.haxx.se>
List-Help: <mailto:webscrapers-requ...@cool.haxx.se?subject=help>
List-Subscribe: <http://cool.haxx.se/cgi-bin/mailman/listinfo/webscrapers>, 
<mailto:webscrapers-requ...@cool.haxx.se?subject=subscribe>



-- 

--> From the U S of A, the only socialist country that refuses to admit it. <--
