https://bugzilla.wikimedia.org/show_bug.cgi?id=57026

--- Comment #25 from Brad Jorsch <bjor...@wikimedia.org> ---
(In reply to comment #21)
> I think there's still double rendering going on. On Commons, a null edit
> takes just slightly more than double the time a page preview takes. On my
> local copy, the operations take roughly the same time (as they should).

I redid the same sort of simulation as I did in comment 10 (anyone with access
to terbium, feel free to look at /home/anomie/eval-bug57026.txt), and it seems
there is no longer any double parsing going on before the text is saved to the
database on enwiki. I also checked Commons by simulating a null edit on a
random page, and again I see no double parsing.
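
(If anyone wants to repeat the check on their own wiki: a rough sketch of the
kind of instrumentation I mean, assuming a LocalSettings.php you can edit --
count parser runs per request with the ParserBeforeInternalParse hook and see
whether a single edit logs the page being parsed more than once. The hook also
fires for interface messages and the like, so look at the logged titles rather
than the raw count.)

  # Rough sketch only, not the eval script on terbium.
  $wgDebugLogGroups['parsecount'] = '/tmp/parsecount.log';
  $wgParseCount = 0;
  $wgHooks['ParserBeforeInternalParse'][] = function ( &$parser, &$text, &$stripState ) {
      global $wgParseCount;
      $wgParseCount++;
      # Log which title is being parsed so edit-related parses stand out.
      $title = $parser->getTitle() ? $parser->getTitle()->getPrefixedText() : '(no title)';
      wfDebugLog( 'parsecount', "parse #$wgParseCount: $title" );
      return true;
  };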

Doing a non-null edit on Commons, I see that abuse filter rule 87's use of
added_links is triggering reparses (the rule seems to be broken such that it
evaluates added_links for all actions). While AbuseFilter could probably use at
least the same fix as in Gerrit change 95481 (and probably some checking of
whether it's going to break the cache by passing the old wikitext for some
variables), I submitted Gerrit change 101224 to make core better about handling
a null format.
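
(Roughly what "handling a null format" means here -- illustrative only, not the
actual text of either Gerrit change: treat a null serialization format as "use
the content model's default" rather than as a distinct value, so the
comparisons that decide whether a cached pre-save parse can be reused still
match.)

  # Illustrative sketch, not the code from Gerrit change 101224.
  function normalizeFormat( Content $content, $format ) {
      # A null format means "whatever the content model serializes to by default".
      return $format === null ? $content->getDefaultFormat() : $format;
  }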

Also, there might still be a second parse from WikiPage::doEditUpdates() if the
page uses magic words that access information about the current revision or if
the page transcludes itself. I don't see any way around that.

And I'm not entirely sure that simply timing a null edit in the browser won't
also include a parser cache miss for the view-after-save, particularly if your
preferences don't match the defaults. Do you still see the double timing if you
do the null edit via the API instead?
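
(Something along these lines is what I have in mind -- a rough sketch, not a
script I've actually run against Commons; it uses the pre-1.24 token fetch via
prop=info&intoken=edit and an empty appendtext so the edit is a null edit, then
just times the POST.)

  <?php
  # Rough sketch: time a null edit made through api.php.
  $api = 'https://commons.wikimedia.org/w/api.php';
  $title = 'Sandbox';  # pick a real page

  function apiPost( $api, array $params ) {
      $ch = curl_init( $api );
      curl_setopt( $ch, CURLOPT_RETURNTRANSFER, true );
      curl_setopt( $ch, CURLOPT_POST, true );
      curl_setopt( $ch, CURLOPT_POSTFIELDS, http_build_query( $params ) );
      curl_setopt( $ch, CURLOPT_COOKIEJAR, '/tmp/nulledit-cookies' );
      curl_setopt( $ch, CURLOPT_COOKIEFILE, '/tmp/nulledit-cookies' );
      $result = json_decode( curl_exec( $ch ), true );
      curl_close( $ch );
      return $result;
  }

  # Fetch an edit token for the page (old-style token fetch).
  $query = apiPost( $api, array(
      'action' => 'query', 'prop' => 'info', 'intoken' => 'edit',
      'titles' => $title, 'format' => 'json',
  ) );
  $page = reset( $query['query']['pages'] );
  $token = $page['edittoken'];

  # Time only the edit request itself.
  $start = microtime( true );
  $result = apiPost( $api, array(
      'action' => 'edit', 'title' => $title, 'appendtext' => '',
      'token' => $token, 'format' => 'json',
  ) );
  printf( "null edit: %.3f s (%s)\n", microtime( true ) - $start,
      isset( $result['edit']['result'] ) ? $result['edit']['result'] : 'error?' );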
