On Wed, Jan 5, 2011 at 8:07 PM, Alex Brollo <alex.bro...@gmail.com> wrote:
> Browsing the HTML code of source pages, I found this statement in an
> HTML comment:
>
> *Expensive parser function count: 0/500*
>
> I'd like to use this figure to evaluate the "lightness" of a page,
> mainly to test how expensive the templates on the page are.  Given
> that the best value would be 0/500, what limits would you suggest for
> a good page, a moderately complex one, and a complex one, just to have
> something to work from?  What is a really alarming value that needs
> fixing fast?

A really alarming value that needs fast fixing would be, approximately
speaking, 501 or higher.  That's why the maximum is there.  We don't
leave fixing this kind of thing to users.
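
That said, if you want to read the counter programmatically instead of
eyeballing the HTML, you can pull it out of the limit report comment
the parser leaves in the page.  A rough Python sketch (the URL is just
an example; substitute whatever page you're checking):

    import re
    import urllib.request

    # Fetch the rendered page and look for the limit report comment
    # MediaWiki appends to the HTML, e.g.:
    #   Expensive parser function count: 0/500
    url = "https://en.wikipedia.org/wiki/Example"  # example page
    html = urllib.request.urlopen(url).read().decode("utf-8")

    m = re.search(r"Expensive parser function count: (\d+)/(\d+)", html)
    if m:
        used, limit = m.groups()
        print("expensive parser functions: %s of %s" % (used, limit))
    else:
        print("no limit report comment found")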

> And wouldn't it be a good idea to display this figure on the page
> itself - just as a very small mark or string in a corner of the page -
> to allow fast feedback?

No.  It's only meant for debugging when you run over the limit and the
page stops working.  It can help you track down why the page isn't
working, and isolate the templates that are causing the problem.  The
same goes for the other limits.

If you want to detect whether a page is rendering too slowly, just try
action=purge and see how long it takes.  If it takes more than a few
seconds, you probably want to improve it, because that's how long the
page will take to render for many logged-in users (the parser cache
hides this from anyone with default preferences, which includes all
anons).  We're forced to use artificial metrics when imposing automatic
limits on page rendering only because the time it takes to parse a page
isn't reliable, and using it as an automatic limit would make parsing
non-deterministic.  For manual inspection, you should just use parse
time, not any artificial metrics.
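
Timing a purge is easy to script, too.  Another rough sketch, with the
caveat that this measures network time as well as parse time, and that
some wikis only honor action=purge on a POST:

    import time
    import urllib.request

    # Force a re-parse and time it.  An empty data argument makes
    # urlopen send a POST, since a GET purge may just return a
    # confirmation form.  The measured time includes network latency,
    # so treat it as an upper bound on parse time.
    url = "https://en.wikipedia.org/wiki/Example?action=purge"
    start = time.time()
    urllib.request.urlopen(url, data=b"")
    print("purge took %.1f seconds" % (time.time() - start))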

