Limiting the number of rows only handles one attack. The one I mentioned,
fetching a page deep in the result set, caused a big issue in production at
our site. We needed to limit the maximum for "start" as well as "rows".
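Capping both parameters can be sketched as a small server-side helper that
clamps whatever the client sends. This is an illustration, not Solr's API;
the class, method, and limit names are made up for the example.

```java
// Minimal sketch of clamping paging parameters before they reach the
// search engine. PagingLimits, clampParam, MAX_ROWS, and MAX_START are
// illustrative names, not part of Solr.
public class PagingLimits {
    static final int MAX_ROWS = 100;    // largest page size we allow
    static final int MAX_START = 5000;  // deepest offset we allow

    // Parse a request parameter, falling back to a default on missing or
    // malformed input, and clamp the result to [0, max].
    static int clampParam(String raw, int dflt, int max) {
        int n;
        try {
            n = Integer.parseInt(raw);
        } catch (NumberFormatException e) {
            return dflt;
        }
        if (n < 0) {
            return dflt;
        }
        return Math.min(n, max);
    }
}
```

With this in place, a request for start=20000 is silently clamped to 5000
instead of forcing the engine to walk the whole result set.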

It is possible to make it safe, but it is a lot of work. We did this for
Ultraseek. I would always, always front it with Apache, to get some
of Apache's protections.
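One way an Apache front end can help is rejecting absurd paging values before
they ever reach the app. A mod_rewrite sketch, where the /solr/ path and the
five-digit threshold are assumptions about your deployment:

```apache
# Reject any request whose start parameter has 5 or more digits (>= 10000).
# Path prefix and cutoff are illustrative; tune them for your site.
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)start=\d{5,}
RewriteRule ^/solr/ - [F]
```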

wunder

On 11/17/08 6:04 AM, "Erik Hatcher" <[EMAIL PROTECTED]> wrote:
> 
> On Nov 16, 2008, at 6:55 PM, Walter Underwood wrote:
>> Limiting the maximum number of rows doesn't work, because
>> they can request rows 20000-20100. --wunder
> 
> But you could limit how many rows could be returned in a single
> request... that'd close off one DoS mechanism.
> 
> Erik

