On Sat, 15 Nov 2003 23:52:31 -0500, you wrote:

>Recently, a 'user' repeatedly attempted to access a restricted area of my 
>site over a span of five hours, entering the same URL over and over 
>[probably by script]. A massive log file was generated. I would like to ban 
>such behavior by limiting the number of successive GETs a user can make (say 
>4 attempts) before an appropriate action is taken.
>
>As a temporary measure (until I can figure out a better way) the URL in 
>question was disabled.
>
>What I'd like to do, on a per-file basis using $_SESSION, is combine the 
>IP address with a counter that records the number of times the file was 
>accessed, and limit the number of successive GETs that can be made before 
>the file is no longer accessible.

Sessions won't work unless the client at the other end co-operates by
sending your session cookie back on each request, and an abusive script is
unlikely to do that.
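
If you still want a counter, it would have to live entirely on the server,
keyed by IP address, so that nothing depends on the client co-operating. A
rough sketch (the directory, threshold and variable names here are purely
illustrative, not anything standard):

<?php
// Sketch: a server-side per-IP hit counter kept in flat files.
// Assumes the web server can write to $counterDir and uses the
// threshold of 4 mentioned above; a real version would also expire
// old counters so legitimate users aren't locked out forever.
$counterDir = '/tmp/hitcounts';
$maxHits    = 4;

$ip   = $_SERVER['REMOTE_ADDR'];
$file = $counterDir . '/' . md5($ip . '|' . $_SERVER['SCRIPT_NAME']);

$hits = 0;
if (is_readable($file)) {
    $hits = (int) file_get_contents($file);
}
$hits++;

$fp = fopen($file, 'w');
if ($fp) {
    fwrite($fp, (string) $hits);
    fclose($fp);
}

if ($hits > $maxHits) {
    header('HTTP/1.0 403 Forbidden');
    exit('Too many requests.');
}
?>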

Blocking is never an ideal solution - automated blocking can catch
innocent users, a DDoS can knock most sites over, IP bans can be got around
by abusing open proxies, and assholes seem to have infinite free time. It's
rarely worth getting into an arms race with them because ultimately, if
someone wants your machine off the net badly enough, there's nothing you can
do about it.

Having said that, the earlier in the chain you block malicious traffic the
better. ISP > router > firewall > web server > script. I think maybe a
script that adds a firewall rule when triggered would be effective in your
case (a quick Google will probably find something like this). Just bear in
mind that it's not foolproof.
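
If you do wire that up in PHP, a bare-bones sketch might look like the
following. It assumes the web server user has been granted sudo rights for
that single iptables command (a security trade-off in itself), and the
counter logic above would decide when to call it:

<?php
// Sketch: once an IP has tripped the threshold, add a firewall rule so
// its traffic never reaches the web server again. Assumes sudo is
// configured to let the web server user run exactly this command.
$ip = $_SERVER['REMOTE_ADDR'];

// Only act on a plain dotted-quad address, and escape it anyway,
// so nothing unexpected reaches the shell.
if (preg_match('/^\d{1,3}(\.\d{1,3}){3}$/', $ip)) {
    exec('sudo /sbin/iptables -I INPUT -s ' . escapeshellarg($ip) . ' -j DROP');
}
?>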

You could also try blocking requests that arrive without a Referer: header...
but again it's not really reliable.
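
The check itself is only a couple of lines, but remember that the header is
trivially forged and that some legitimate browsers and proxies strip it:

<?php
// Sketch: refuse requests that arrive without a Referer header.
// Expect false positives from privacy-conscious clients and proxies.
if (empty($_SERVER['HTTP_REFERER'])) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
?>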

-- 
PHP General Mailing List (http://www.php.net/)
To unsubscribe, visit: http://www.php.net/unsub.php