Hello,

> But it's now being claimed (one might assume, in defense of the
> new policy) that disallowing missing User-Agent strings is cutting
> 20-50% of the (presumably undesirable) load.  Which sounds pretty
> primary.  So which is it?

Check the CPU drop on Monday:
http://ganglia.wikimedia.org/pmtpa/graph.php?g=cpu_report&z=medium&c=Search&m=&r=week&s=descending&hc=4

Network drop on API: 
http://ganglia.wikimedia.org/pmtpa/graph.php?g=network_report&z=medium&c=API%20application%20servers&m=&r=week&s=descending&hc=4

etc.

Sure, you can assume that we need to come up with something to "defend the new 
policy". 

> Presumably some percentage of that 20-50% will come back as the
> spammers realize they have to supply the string.  Presumably we
> then start playing whack-a-mole.

Yes, we will ban all IPs participating in this. 

> Presumably there's a plan for what to do when the spammers begin
> supplying a new, random string every time.

Random strings are easy to identify; fixed strings are easy to verify. 
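To illustrate the first half of that: an IP that randomizes its User-Agent sends a fresh string on nearly every request, which stands out immediately in per-IP aggregates. A minimal sketch (hypothetical log format and thresholds; not anything Wikimedia actually runs):

```python
from collections import defaultdict

def flag_random_ua_ips(requests, min_requests=10, ratio=0.9):
    """Flag IPs whose User-Agent varies on almost every request.

    `requests` is an iterable of (ip, user_agent) pairs -- an
    assumed, simplified log format for illustration only.
    """
    distinct = defaultdict(set)   # ip -> distinct UA strings seen
    total = defaultdict(int)      # ip -> total request count
    for ip, ua in requests:
        distinct[ip].add(ua)
        total[ip] += 1
    # An IP whose distinct-UA count is close to its request count
    # is almost certainly generating a new random string each time.
    return {ip for ip in total
            if total[ip] >= min_requests
            and len(distinct[ip]) / total[ip] >= ratio}
```

A client with one fixed (verifiable) string never trips this; a randomizer does after a handful of requests.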

> (I do worry about where this is going, though.)

Going where it always goes: proper operation of the website. Been there, done 
that. 

Domas
_______________________________________________
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l