Kelson already did it a few days ago.
On Mon, Nov 16, 2020 at 3:54 PM Travis Briggs wrote:
>
> Can someone mark mwoffliner as "in use", please?
>
> I lost my phone and now I can't sign in to mediawiki tech because of 2FA.
>
> Thanks,
> -Travis
>
Kimmo, while I can't directly answer your question on bottlenecks, I will
try to provide a little background on existing issues for those who are
new (like myself!).
Here's a recent example of replication issues with the current setup:
Yes, I have a Toolforge account and there are a bunch of cronjobs that run
weekly (and a few that run daily).
The code can be found at
https://github.com/PersianWikipedia/fawikibot/tree/master/HujiBot where
stats.py is the program that actually connects to the DB, but weekly.py and
weekly-slow.py
On Mon, Nov 2, 2020 at 9:09 AM Maximilian Doerr
wrote:
> Cyberbot will never be unclaimed. :-)
>
> Cyberpower678
> English Wikipedia Administrator
>
Hey MA, I've checked, and while this isn't explicitly disallowed, the fact
that it works is more of an implementation detail that shouldn't really be
relied on.
The sections, and which instances sit on them, are organized to keep the
service maintainable, and are not supposed to be depended on, since
Hi Maarten,
I believe this work started many years ago, was paused, and was recently
restarted because of the stability and performance problems of the last few
years. Breaking changes are always painful; in the case of the replicas, I
think the changes follow the recommendations laid out years
Moving the joins to the application layer definitely makes things quite
complex compared to a single SQL query.
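To make that concrete, here is a toy sketch of the difference (table names,
columns, and data are all made up, and sqlite3 stands in for the replica
databases purely for illustration): the same page/link aggregation is done
once as a single SQL JOIN, and once by hand in application code after two
separate queries.

```python
import sqlite3

# Hypothetical stand-ins for two tables that can no longer be joined
# server-side once they live on different database servers.
pages = [(1, "Foo"), (2, "Bar"), (3, "Baz")]
links = [(1, 10), (1, 11), (3, 12)]

# --- Single-database case: one SQL JOIN does all the work ---
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE page (page_id INTEGER, title TEXT)")
db.execute("CREATE TABLE pagelinks (page_id INTEGER, target_id INTEGER)")
db.executemany("INSERT INTO page VALUES (?, ?)", pages)
db.executemany("INSERT INTO pagelinks VALUES (?, ?)", links)
sql_result = db.execute(
    "SELECT p.title, COUNT(*) FROM page p "
    "JOIN pagelinks pl ON pl.page_id = p.page_id "
    "GROUP BY p.title ORDER BY p.title"
).fetchall()

# --- Cross-database case: the join moves into the application ---
# Each lookup would now be a query against a different server, and the
# join, grouping, and ordering logic all become our code to maintain.
titles = {page_id: title for page_id, title in pages}   # "query" to server A
counts = {}                                             # "query" to server B
for page_id, _target in links:
    counts[page_id] = counts.get(page_id, 0) + 1
app_result = sorted((titles[pid], n) for pid, n in counts.items())

assert sql_result == app_result  # same answer, much more of it is our code
```

Even in this tiny example the application-layer version has to re-implement
hashing, grouping, and sorting that the database would otherwise do, and a
real tool would also need batching and error handling for the extra queries.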
Having a data lake or another solution like you mention makes these kinds of
joins more feasible with big data, but it also usually requires careful
schema and index design when