Pretty much any wiki can be modified to use CAPTCHAs and, of course,
more importantly, to stop using the default form parameter names.
Other tricks are also very feasible, like adding fake submit fields
inside hidden divs (so human users never see them) that act as
honeypots: any submission that fills them in gets its source blocked.
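The honeypot trick above can be sketched server-side in a few lines.
This is a minimal illustration, not anything from the original post:
the field names and the form handling are assumptions, and the
"blocking" step is left to whatever the site already uses.

```python
# Honeypot check: the form ships an extra input, hidden from humans
# with CSS (e.g. inside <div style="display:none">).  Real users leave
# it blank; naive spambots fill in every field they find and so
# reveal themselves.  Field names here are hypothetical examples.

HONEYPOT_FIELD = "website"  # invisible to humans, tempting to bots

def is_spam_submission(form_data: dict) -> bool:
    """Return True if the hidden honeypot field was filled in."""
    return bool(form_data.get(HONEYPOT_FIELD, "").strip())

# A human never sees the field, so it arrives empty and passes;
# a bot that stuffs a URL into it gets flagged, and the site can
# then drop the post and block the source IP.
```
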

One of the inherent dangers in using canned sites is that a spambot
only needs to know how to deal with that code.
Changing it up a bit confuses them. I have yet to see a spambot with
adaptive capabilities. This is why sites written in 100% custom code
are effectively immune until they get big enough (like Facebook, etc)
to warrant spambots targeted specifically at them.

A documentation wiki, especially for something as esoteric as building
GUI apps in Perl, is not likely to ever get the sort of traffic that
would inspire a lazy spambot coder to specifically target it.

Even big documentation sites like the MySQL dev site, PHP's site, etc
aren't really big enough to attract that kind of attention, since
spammers are (a) just trying to saturate the places where everyone
goes, and (b) if they have even a single functioning synapse, likely
to catch on that we're too intelligent to be their target market.
That leaves SEO as their only purpose: introducing linkbacks for
Google to read in -- and even that is up against Google's and Yahoo's
own ever-stronger optimisations and stupidity-filters.

It doesn't need to be impossible. It just needs to require work--any
amount of work at all--to dissuade them from something that offers no
potential bang for lifting a lazy finger.

-- 
Dodger
