bd808 added a comment.

  In T399415#10999020 <https://phabricator.wikimedia.org/T399415#10999020>, 
@Xqt wrote:
  
  > In T399415#10998648 <https://phabricator.wikimedia.org/T399415#10998648>, 
@bd808 wrote:
  >
  >> I guess my first question is if these tests could run from Wikimedia 
infrastructure rather than GitHub Actions.
  >
  > We could probably use **self-hosted runners** on WMF infrastructure:
  
  That might be possible. One of the challenges would be finding folks to 
monitor these runners and keep them working. That is probably not an impossible 
challenge, but it won't be trivial either.
  
  > To port these tests to Jenkins looks much more difficult to me and I have 
no idea if and how this would be possible.
  
  Moving to tests run by Zuul + Jenkins would probably be possible, but also 
annoying at the moment. The #zuul-upgrade 
<https://phabricator.wikimedia.org/project/view/7592/> project is working 
towards changing a lot of things in that CI pipeline, so the work would likely 
need an initial implementation and then a follow-up project to move from tests 
described with Jenkins Job Builder to the Ansible replacement.
  
  Yet another option might be figuring out how to mirror the pywikibot code to 
gitlab.wikimedia.org and then using the self-service CI pipelines there to run 
your tests. We currently have both locally hosted and externally hosted 
gitlab-runners. We do not, however, have Windows or macOS runners, which are 
things I see at least a few of the pywikibot GitHub Actions using.
  
  > Blocking IPs cannot be a long-term solution, and you also have to ask 
yourself what to do if sites other than beta are affected. So there should be 
some bypass mechanism for trusted CI traffic through headers or tokens or 
maxlag-ish throttling. But you know that better than I do.
  
  IP blocking is likely here to stay. We are fundamentally having the same 
problem as production wikis trying to block LTA-type vandals. The compounding 
issue here is that it is not just edit traffic that is causing us problems, but 
read traffic as well. The production wikis are having the same core problem 
with aggressive scraper bots 
(https://diff.wikimedia.org/2025/04/01/how-crawlers-impact-the-operations-of-the-wikimedia-projects/),
 but they are taken care of by more people and are also getting a focused 
project to add more automated traffic management. Unfortunately, I doubt much 
of that work will be applicable to the beta cluster wikis due to staffing and 
technology constraints.
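  As an aside, the "maxlag-ish throttling" mentioned above is already a 
MediaWiki API convention: a client sends a `maxlag=` parameter, and the server 
rejects the request with a `maxlag` error (plus a `Retry-After` header) when 
replication lag is too high. A minimal sketch of the client side, independent 
of Pywikibot's own throttle code (the helper names here are illustrative, not 
an existing API):

```python
def is_maxlag_error(resp_json):
    """True when the API refused the request because replication lag
    exceeded the maxlag= value the client sent."""
    return resp_json.get("error", {}).get("code") == "maxlag"


def retry_delay(headers, default=5.0):
    """MediaWiki sends Retry-After on maxlag errors; fall back to a
    fixed pause when the header is absent or malformed."""
    try:
        return float(headers.get("Retry-After", default))
    except (TypeError, ValueError):
        return default


# Hypothetical usage with a requests-style session:
#   params = {"action": "query", "meta": "siteinfo", "format": "json",
#             "maxlag": 5}  # ask the server to reject us when lagged
#   resp = session.get(API_URL, params=params)
#   if is_maxlag_error(resp.json()):
#       time.sleep(retry_delay(resp.headers))
#       # ...then retry the request...
```

  A header- or token-based bypass for trusted CI traffic would sit on the 
server side of this exchange, which is a separate discussion.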

TASK DETAIL
  https://phabricator.wikimedia.org/T399415