I think that I finally have a clue as to why those empty pages were
returned.

I have Perlbal as the front end, and it was set to maintain persistent
connections with the Apache backend listening on localhost.

I also have some Apache configuration that denies access to certain user
agents known to have abused my site before, such as wget, RapidDownloader,
ia_archiver, and a few others. (I know that wget can be run with different
user agents. The people who run wget against my site are usually clueless;
if they had enough clue to change the user agent, they would also realize
that my site is dynamic and very deep.)
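For reference, that kind of user-agent blocking is typically done with
mod_setenvif plus an access rule. This is just a sketch of the usual
pattern for Apache 2.2 (the era of Ubuntu Hardy), not my exact config:

```apache
# Tag known abusive user agents (case-insensitive substring match)
BrowserMatchNoCase "wget"            bad_bot
BrowserMatchNoCase "RapidDownloader" bad_bot
BrowserMatchNoCase "ia_archiver"     bad_bot

<Location />
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Location>
```

With this pattern Apache answers 403 to the tagged agents; the env var is
set per request, but the access decision can still interact badly with a
connection that a proxy reuses for many different clients.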

So my conjecture is that Perlbal selected some worker for a persistent
connection, and the first user agent to connect happened to be one of those
"bad guys". The worker would then reject all subsequent requests arriving
on the same TCP connection, with the unfortunate effect that every request
was rejected.

Since I disabled persistent connections (which should not matter much on
localhost anyway), I have not seen the problem where my server starts
returning empty pages.
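Disabling backend persistence in Perlbal is a one-line change in the
service configuration. A minimal sketch, assuming a single Apache backend
on port 8080 (the names and ports here are illustrative, not my actual
setup):

```
CREATE POOL apache_pool
  POOL apache_pool ADD 127.0.0.1:8080

CREATE SERVICE web
  SET role            = reverse_proxy
  SET listen          = 0.0.0.0:80
  SET pool            = apache_pool
  SET persist_backend = off
ENABLE web
```

With persist_backend off, Perlbal opens a fresh connection to Apache for
each request, so one "bad" client can no longer poison a reused connection.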

I also verified with ab that the change caused no performance hit.
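The ab check was just a quick before/after comparison of requests per
second against the front end; the request count and concurrency below are
illustrative:

```
# 1000 requests, 10 concurrent, against the Perlbal front end
ab -n 1000 -c 10 http://localhost/
```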

The stock Ubuntu Hardy mod_perl is solid as a rock, now.

I want to thank everyone. It was a tough one because regular stress testing
would not trigger it.

As for those spiders, some of them are sort of legitimate, like ia_archiver
(which I took for a rogue bot at some point because it would not provide a
web page identifying itself), but the traffic they cost me is not worth the
benefit I get from them.

Igor
