Going off on a slight tangent, is it possible to limit the number of
requests per second per remote IP? Similar to how mod_cband limits
requests per vhost/user, but applied to remote clients instead?
Thanks
Ben
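One way to approximate that kind of per-IP limit at the packet-filter level is the iptables 'recent' match. The list name and thresholds below are only illustrative, and the exact counting behaviour varies a little between kernel versions, so treat this as a sketch rather than a recipe:

# remember every new connection to port 80, keyed by source IP
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --name www --set
# drop a source that has opened more than 10 new connections within the last second
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --name www --update --seconds 1 --hitcount 10 -j DROP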
In case anyone else runs into the same problem, it turned out that a
convenient fix was to use mod_evasive, which will temporarily firewall
IPs based on the number of TCP connections. The same Chinese sites are
still downloading material, but now in an orderly and manageable way :-)
Graham
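For anyone wanting to try the same thing, a minimal mod_evasive setup on Debian/Ubuntu might look roughly like this; the package name, file path and thresholds below are assumptions, not details taken from this thread:

# install the module (this normally enables it as well)
apt-get install libapache2-mod-evasive

# drop a basic policy into Apache's conf.d directory
cat > /etc/apache2/conf.d/mod-evasive.conf <<'EOF'
<IfModule mod_evasive20.c>
    DOSHashTableSize    3097
    # more than 5 requests for the same URI within 1 second trips the block
    DOSPageCount        5
    DOSPageInterval     1
    # more than 50 requests for the whole site within 1 second trips the block
    DOSSiteCount        50
    DOSSiteInterval     1
    # offending clients are refused for 300 seconds
    DOSBlockingPeriod   300
</IfModule>
EOF

/etc/init.d/apache2 reload

Note that on its own mod_evasive answers offenders with 403s; actually firewalling them needs DOSSystemCommand or an external script.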
Unfortunately connlimit is missing from both debian and ubuntu at the
moment:
https://bugs.launchpad.net/ubuntu/+source/linux-source-2.6.20/+bug/60439/+activity
Shame, it looked like that was going to be such a neat way to fix the
problem...
Graham
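A quick way to see whether a given box has the pieces at all (exact error text and module names vary between iptables and kernel versions, so this is only a sketch):

# if the userspace extension is installed this prints the connlimit options;
# if not, iptables complains that it could not load the match
iptables -m connlimit --help
# the kernel side is a separate module (ipt_connlimit on older kernels,
# xt_connlimit on newer ones)
modprobe ipt_connlimit || modprobe xt_connlimit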
Hamilton Vera wrote:
It is just a target name:
# create a new chain called logdropdos
$IPTABLES -N logdropdos
# log matching packets with a recognisable prefix, then drop them
$IPTABLES -A logdropdos -j LOG --log-level INFO --log-prefix "[logdropdos]"
$IPTABLES -A logdropdos -j DROP
It is just there to make the log analysis easier; you can also use a
plain "-j DROP" instead.
Hamilton Vera
Hamilton Vera wrote:
You can try using iptables to limit the number of TCP connections:
$IPTABLES -A INPUT -p TCP -i $WAN -s 0/0 --syn --dport 80 -m connlimit \
--connlimit-above 10 -j logdropdos
Sounds good. What's 'logdropdos'? I don't seem to have it, and
Google gives me nothing.
You can try using iptables to limit the number of TCP connections:
# new connections from a source that already holds more than 10 go to logdropdos
$IPTABLES -A INPUT -p TCP -i $WAN -s 0/0 --syn --dport 80 -m connlimit \
--connlimit-above 10 -j logdropdos
Or implement a FreeBSD firewall with QoS, applying traffic shaping to
parallel TCP connections.
I hope this helps.
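For the FreeBSD suggestion, a minimal ipfw/dummynet sketch might look like the following; the bandwidth figure and rule numbers are placeholders, and the kernel needs dummynet support (options DUMMYNET, or kldload dummynet):

# one pipe, masked on source IP, so each remote host gets its own 256 kbit/s queue
ipfw pipe 1 config bw 256Kbit/s mask src-ip 0xffffffff
# push inbound HTTP traffic through that pipe
ipfw add 100 pipe 1 tcp from any to me 80 in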
Hi,
I've just become involved with a system running Apache 2.0.55 on Ubuntu
with Linux 2.6.17.
The system is currently unable to run due to repeated downloads of a
large number of PDFs by systems located in China. These are hogging all
sockets and eventually causing Apache to die.
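A quick way to see this kind of per-IP socket hogging in practice (purely illustrative, IPv4 only) is to count established connections to port 80 by remote address, busiest first:

netstat -nt | awk '$4 ~ /:80$/ && $6 == "ESTABLISHED" {split($5, a, ":"); print a[1]}' | sort | uniq -c | sort -rn | head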