Re: [Wikitech-l] Performance of parsing links?

2015-01-29 Thread Federico Leva (Nemo)
As of
https://gerrit.wikimedia.org/r/#/c/29879/2/utils/MessageTable.php,cm ,
Linker::link took 20 KiB of memory per call. Cf.
http://laxstrom.name/blag/2013/02/01/how-i-debug-performance-issues-in-mediawiki/
I don't know whether such bugs/misfeatures and the related best practices
are written down anywhere.


Nemo

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Chad
On Tue Jan 27 2015 at 1:37:36 PM Brion Vibber bvib...@wikimedia.org wrote:

 Probably the fastest thing would be to manually create the ul/li markup
 etc. and wrap it around a loop calling the linker functions (Linker::link).

 https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a52523fb9f10737404b1dfa45bab61045


Another option could be using LinkBatch.
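A rough sketch of that approach (MediaWiki environment assumed; $userNames and the surrounding loop are illustrative): LinkBatch pre-fetches the existence status of all the titles in one query, so the subsequent Linker::link calls don't each hit the database.

```php
// Pre-fetch page existence for all titles in a single query,
// so Linker::link() can render without per-link database lookups.
$batch = new LinkBatch();
foreach ( $userNames as $name ) { // $userNames is illustrative
	$batch->addObj( Title::makeTitleSafe( NS_USER, $name ) );
}
$batch->execute(); // populates the LinkCache

$html = '<ul>';
foreach ( $userNames as $name ) {
	$title = Title::makeTitleSafe( NS_USER, $name );
	$html .= '<li>' . Linker::link( $title, htmlspecialchars( $name ) ) . '</li>';
}
$html .= '</ul>';
```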

-Chad

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Brion Vibber
Probably the fastest thing would be to manually create the ul/li markup
etc. and wrap it around a loop calling the linker functions (Linker::link).

https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a52523fb9f10737404b1dfa45bab61045
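For the nested org-chart list, that might look something like the sketch below (MediaWiki environment assumed; the input format and variable names are illustrative). It builds the list markup directly instead of emitting wikitext for the parser to re-parse.

```php
// Build the nested <ul>/<li> markup directly, one Linker::link() per entry.
// $rows is illustrative: an array of [ depth, userName ] pairs, e.g.
// [ [ 1, 'Bob' ], [ 2, 'Jane' ], [ 2, 'Mike' ] ].
// (Strictly valid nesting would open the inner <ul> inside the parent <li>;
// this sketch keeps the bookkeeping minimal.)
$html = '';
$depth = 0;
foreach ( $rows as list( $level, $name ) ) {
	while ( $depth < $level ) { $html .= '<ul>'; $depth++; }
	while ( $depth > $level ) { $html .= '</ul>'; $depth--; }
	$title = Title::makeTitleSafe( NS_USER, $name );
	$html .= '<li>' . Linker::link( $title, htmlspecialchars( $name ) ) . '</li>';
}
while ( $depth > 0 ) { $html .= '</ul>'; $depth--; }
```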

-- brion

On Tue, Jan 27, 2015 at 1:33 PM, Daniel Barrett d...@cimpress.com wrote:

 I'm writing a parser function extension that outputs about 5000 lines of
 text (an organizational chart of a company) as a nested, bulleted list.

 * Bob the CEO
 ** Jane Jones
 ** Mike Smith
 *** etc.

 It takes about 3 seconds (real time) for MediaWiki to render this list,
 which is acceptable. However, if I make it a list of links, which is more
 useful:

 * [[User:Bob | Bob the CEO]]
 ** [[User:Jane | Jane Jones]]
 ** [[User:Mike | Mike Smith]]

 the rendering time more than doubles to 6-8 seconds, which users perceive
 as too slow.

 Is there a faster implementation for rendering a large number of links,
 rather than returning the wikitext list and having MediaWiki render it?

 Thanks,
 DanB

 
 My email address has changed to d...@cimpress.com. Please update your
 address book.

 Cimpress is the new name for Vistaprint NV, the world’s leader in mass
 customization. Read more about Cimpress at www.cimpress.com.
 

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Daniel Friesen
You should be able to return something like this from your parser
function to output raw HTML instead of wikitext:

return array( $output, 'noparse' => true, 'isHTML' => true );
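Put together as a parser function handler, that might look like the sketch below (the function name and the placeholder $output are illustrative): 'isHTML' tells the parser the result is already HTML, and 'noparse' prevents it from being re-parsed as wikitext.

```php
// Illustrative parser function handler returning pre-built HTML.
function efRenderOrgChart( Parser $parser /* , ...args */ ) {
	$output = '<ul><li>Bob the CEO</li></ul>'; // built elsewhere
	return array( $output, 'noparse' => true, 'isHTML' => true );
}
```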

~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/]

On 2015-01-27 1:33 PM, Daniel Barrett wrote:
 I'm writing a parser function extension that outputs about 5000 lines of text 
 (an organizational chart of a company) as a nested, bulleted list.

 * Bob the CEO
 ** Jane Jones
 ** Mike Smith
 *** etc.

 It takes about 3 seconds (real time) for MediaWiki to render this list, which 
 is acceptable. However, if I make it a list of links, which is more useful:

 * [[User:Bob | Bob the CEO]]
 ** [[User:Jane | Jane Jones]]
 ** [[User:Mike | Mike Smith]]

 the rendering time more than doubles to 6-8 seconds, which users perceive as 
 too slow.

 Is there a faster implementation for rendering a large number of links, 
 rather than returning the wikitext list and having MediaWiki render it?

 Thanks,
 DanB


[Wikitech-l] Performance of parsing links?

2015-01-27 Thread Daniel Barrett
I'm writing a parser function extension that outputs about 5000 lines of text 
(an organizational chart of a company) as a nested, bulleted list.

* Bob the CEO
** Jane Jones
** Mike Smith
*** etc.

It takes about 3 seconds (real time) for MediaWiki to render this list, which 
is acceptable. However, if I make it a list of links, which is more useful:

* [[User:Bob | Bob the CEO]]
** [[User:Jane | Jane Jones]]
** [[User:Mike | Mike Smith]]

the rendering time more than doubles to 6-8 seconds, which users perceive as 
too slow.

Is there a faster implementation for rendering a large number of links, rather 
than returning the wikitext list and having MediaWiki render it?

Thanks,
DanB

