Rick Welykochy wrote:
> Since domains are case insensitive ...
>
> if ( $ENV{HTTP_REFERER} !~ /example.com/i )
> --^
>
Good point.
> But then again, why not put this into .htaccess or httpd.conf:
>
> Order allow,deny
> Allow from all
> Deny from example.com
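One caveat on that approach: "Deny from example.com" matches the client's
address or hostname, not the Referer header. To block by referer from
.htaccess or httpd.conf, a sketch along these lines should work, assuming
mod_setenvif and mod_access are compiled in (the environment variable name
is arbitrary):

    SetEnvIfNoCase Referer "example\.com" from_example_com
    Order allow,deny
    Allow from all
    Deny from env=from_example_com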
Sonam Chauhan wrote:
> sub handler
> {
> if ( $ENV{HTTP_REFERER} !~ /example.com/ )
> { return DECLINED; } #ok... go to next handler
Since domains are case insensitive ...
if ( $ENV{HTTP_REFERER} !~ /example.com/i )
--^
But then again, why not put this into .htaccess or httpd.conf?
[EMAIL PROTECTED] wrote:
> Where specifically deny http access to traffic
> referred from example.com
>
> Kev
> > It could be done through a perl script...
> > but there is probably a better way
>
> This is exactly what I need to do.
If you're using Apache, you can write a simple Apache handler
to do a fast check of HTTP_REFERER on each page served
off your website.
I don't know your setup, but with mod_perl ...
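A rough sketch of such a handler under mod_perl 1.x (the package name
My::BlockReferer is made up here, and the regex escapes the dot and adds
/i as pointed out elsewhere in the thread):

    package My::BlockReferer;
    use strict;
    use Apache::Constants qw(DECLINED FORBIDDEN);

    sub handler {
        my $r = shift;
        # look at the Referer header of the current request
        my $referer = $r->header_in('Referer') || '';
        # refuse requests referred from example.com
        return FORBIDDEN if $referer =~ /example\.com/i;
        # otherwise let the next access handler decide
        return DECLINED;
    }
    1;

Hooked up in httpd.conf with something like:

    PerlModule My::BlockReferer
    <Location />
        PerlAccessHandler My::BlockReferer
    </Location>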
Daniel Finn wrote:
> or do you want to deny all traffic that is referred by a link from a website
> (example.com) to your website?
> It could be done through a perl script...
> but there is probably a better way
This is exactly what I need to do.
kevin
[EMAIL PROTECTED] wrote:
> Sorry, my bad.
> I wish to deny all http traffic referred
> from example.com
Not sure if it fits your bill, but occasionally I blacklist a whole C class
at the firewall for a certain time because their search engine ignores
robots.txt. I very rarely have ...
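For what it's worth, a firewall rule of that sort might look like this
(a sketch assuming iptables; 192.0.2.0/24 is just a placeholder network):

    # drop all HTTP traffic coming from one class C
    iptables -A INPUT -p tcp -s 192.0.2.0/24 --dport 80 -j DROP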
Rachel Polanskis wrote:
>
> I think Rick is trying to say "Please speak in a language other than Geek".
> I had to read your question at least twice before I understood the context.
>
Sorry, my bad.
I wish to deny all http traffic referred
from example.com
kevin
On Mon, Dec 04, 2000 at 09:39:18AM +1100, [EMAIL PROTECTED] wrote:
> Where specifically deny http access to traffic
> referred from example.com
If I understand correctly, you mean people clicking on links at example.com,
not people surfing from the example.com server? It can certainly be done; a
lot of free ...
Rick Welykochy wrote:
>
> [EMAIL PROTECTED] wrote:
>
> > Where specifically deny http access to traffic
> > referred from example.com
>
> Surely. Of course.
>
???
Kevin
[EMAIL PROTECTED] wrote:
> Where specifically deny http access to traffic
> referred from example.com
Surely. Of course.
--
Rick Welykochy || Praxis Services Pty Limited
"Tired of being a crash test dummy for Microsoft? Try Linux"
Where specifically deny http access to traffic
referred from example.com
Kevin
--
SLUG - Sydney Linux User Group Mailing List - http://slug.org.au/
More Info: http://slug.org.au/lists/listinfo/slug