Although it's a different use case, I find myself facing a related problem with 
my bot on Commons.

https://commons.wikimedia.org/wiki/Commons:Auto-protected_files/wikipedia/zh

These pages get updated periodically whenever a Commons image starts or stops 
being used on a Main Page.
Aside from the fact that this particular case could be a native feature[1], such 
a page accumulates a lot of revisions.

Another aside is that these old revisions are quite useless. Even if someone 
were to make an edit in between, the bot would overwrite it. But I don't care 
much about the wasted space, since it's a relatively small waste.

The problem comes in when these pages need maintenance. I find myself 
periodically checking up on my bot-generated pages so that I can move them to 
/Archive_#, delete those, and start clean, because once the revision count 
reaches 2,500, a page can no longer be deleted due to the limit we implemented.
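
For anyone curious, a check for how close a page is to that limit could look 
roughly like this (a minimal pywikibot sketch; the title is just one of the 
pages mentioned here):

    import pywikibot

    site = pywikibot.Site('commons', 'commons')
    page = pywikibot.Page(site, 'Commons:Auto-protected files/wikipedia/zh')

    # Iterate revision metadata only (content=False keeps the query light)
    # and count how many revisions the page has accumulated so far.
    count = sum(1 for _ in page.revisions(content=False))
    print('%s: %d revisions' % (page.title(), count))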

So to keep them manageable and usable, I always fight the limit by moving a 
page to a subpage (without leaving a redirect) before it reaches that limit, 
deleting it there, and starting a new page at the old name.
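
Scripted, that dance would look something like this (again just a sketch with 
pywikibot, not the actual bot code; it assumes an account with move, 
suppressredirect, and delete rights, and the archive title and edit summaries 
are made up):

    import pywikibot

    LIMIT = 2500  # the delete limit mentioned above
    TITLE = 'Commons:Auto-protected files/wikipedia/zh'
    ARCHIVE = TITLE + '/Archive 1'  # hypothetical archive name

    site = pywikibot.Site('commons', 'commons')
    site.login()
    page = pywikibot.Page(site, TITLE)

    # Act well before the limit, while the page can still be deleted.
    if sum(1 for _ in page.revisions(content=False)) >= LIMIT - 100:
        # Move without leaving a redirect, freeing up the old title
        # (this needs the suppressredirect right).
        page.move(ARCHIVE, reason='Rotating bot-maintained page',
                  noredirect=True)

        # Delete the archived copy while it is still deletable.
        pywikibot.Page(site, ARCHIVE).delete(
            reason='Stale bot-generated revisions', prompt=False)

        # Start a fresh page at the old name; the bot fills it in
        # again on its next run.
        fresh = pywikibot.Page(site, TITLE)
        fresh.text = '<!-- regenerated by bot -->'
        fresh.save(summary='Starting clean after archiving')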

It would be nice if this kind of bot-generated page could avoid that. And if in 
doing so we save some revision space, that's a bonus.

Other examples:
* https://commons.wikimedia.org/wiki/Commons:Database_reports (and subpages)
* https://commons.wikimedia.org/wiki/Commons:Auto-protected_files (and subpages)

-- Krinkle


