Accidentally replied directly instead of to the list like I meant to.

On Thu, Feb 20, 2020 at 8:15 AM bawolff <bawolff...@gmail.com> wrote:

> Some back-of-the-napkin math:
>
> If it takes 0.5 seconds to parse a page on average, it would take 289 days
> to refresh all the pages on Wikipedia (assuming we aren't parallelizing the
> task). It definitely seems like a non-trivial amount of computational work.
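>
> A quick sanity check of that arithmetic, as a Python sketch (the ~50
> million total page count is an assumption; it's roughly what the
> 289-day figure implies):
>
>     # back-of-the-napkin: serial time to reparse every page
>     pages = 50_000_000             # assumed total page count
>     seconds_per_parse = 0.5        # average parse time, from above
>     total_seconds = pages * seconds_per_parse
>     print(total_seconds / 86_400)  # -> ~289 days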
>
> See also the discussion at https://phabricator.wikimedia.org/T157670
> (basically the proposal is: instead of trying to purge everything in
> alphabetical order, just purge things that haven't been purged in a year
> or so. The discussion is fairly old; I don't think anyone is working on
> it anymore).
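>
> A minimal sketch of that proposal in Python (fetch_stalest and
> links_update here are hypothetical stand-ins, not real MediaWiki APIs;
> the real page table does have a page_links_updated timestamp a job
> could sort by):
>
>     import time
>
>     BATCH = 100  # pages to refresh per round
>
>     def fetch_stalest(limit):
>         """Stand-in: page IDs ordered by oldest page_links_updated."""
>         return []  # a real job would query the page table here
>
>     def links_update(page_id):
>         """Stand-in: re-run LinksUpdate (categories, linter, etc.)."""
>         pass
>
>     while True:
>         batch = fetch_stalest(BATCH)
>         if not batch:
>             time.sleep(3600)  # nothing stale enough; check back later
>             continue
>         for page_id in batch:
>             links_update(page_id)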
>
> --
> bawolff
>
> On Thu, Feb 20, 2020 at 7:50 AM Amir E. Aharoni <
> amir.ahar...@mail.huji.ac.il> wrote:
>
>>
>> On Thu, Feb 20, 2020 at 9:26 bawolff <bawolff...@gmail.com> wrote:
>>
>>> Pretty sure the answer is no (although I don't know that for a fact).
>>>
>>> However, the parser cache only lasts for 30 days, so pages will get
>>> reparsed at least once every 30 days (if viewed). That's separate from
>>> a links update (i.e. categories, linter, etc.).
>>>
>>> I suspect that doing a linksupdate of every article once a month would
>>> take
>>> more than a month.
>>>
>>
>> If that's true, and it may well be, is it conceivable to make this some
>> kind of continuous process?
>>
>>
>
