Re: [Wikitech-l] Performance of parsing links?

2015-01-29 Thread Federico Leva (Nemo)
As of https://gerrit.wikimedia.org/r/#/c/29879/2/utils/MessageTable.php,cm , Linker::link took 20 KiB of memory per call. Cf. http://laxstrom.name/blag/2013/02/01/how-i-debug-performance-issues-in-mediawiki/ I don't know whether such bugs/unfeatures and the related best practices were ever written down.

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Chad
On Tue Jan 27 2015 at 1:37:36 PM Brion Vibber bvib...@wikimedia.org wrote: Probably the fastest thing would be to manually create the ul/li etc. and wrap them around a loop calling the linker functions (Linker::link). https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Brion Vibber
Probably the fastest thing would be to manually create the ul/li etc. and wrap them around a loop calling the linker functions (Linker::link). https://doc.wikimedia.org/mediawiki-core/master/php/html/classLinker.html#a52523fb9f10737404b1dfa45bab61045 -- brion On Tue, Jan 27, 2015 at 1:33 PM,
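Brion's suggestion could be sketched as follows. This is only an illustration, not code from the thread: Linker::link and Title::newFromText are real MediaWiki APIs, but the loop, the $pageNames variable, and the flat single-level list are assumptions for the sake of the example.

```php
// Sketch: build the <ul>/<li> markup by hand and call Linker::link
// in a loop, instead of emitting wikitext for the parser to re-parse.
// Assumes this runs inside MediaWiki, with Title and Linker loaded.
$html = '<ul>';
foreach ( $pageNames as $name ) {
    $title = Title::newFromText( $name );
    if ( $title ) {
        // Linker::link() produces the <a href=...> HTML for a page link.
        $html .= '<li>' . Linker::link( $title ) . '</li>';
    }
}
$html .= '</ul>';
```

For a nested org chart the loop would have to track the nesting depth and open/close inner `<ul>` elements accordingly; the one-level version above just shows the pattern of wrapping Linker::link calls in hand-built list markup.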

Re: [Wikitech-l] Performance of parsing links?

2015-01-27 Thread Daniel Friesen
You should be able to return something like this to make your parser function output raw HTML instead of WikiText. return array( $output, 'noparse' => true, 'isHTML' => true ); ~Daniel Friesen (Dantman, Nadir-Seen-Fire) [http://danielfriesen.name/] On 2015-01-27 1:33 PM, Daniel Barrett wrote:
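The return value Daniel describes fits into a parser-function handler like the sketch below. The function name and the hard-coded $output are hypothetical; the array flags ('noparse' and 'isHTML') are the documented way to tell the parser the result is already HTML and should not be parsed as wikitext.

```php
// Sketch of a parser function handler (registered via
// $parser->setFunctionHook) that returns pre-built HTML.
function efRenderOrgChart( Parser $parser ) {
    // In practice this would be the HTML built from the org-chart data.
    $output = '<ul><li>Bob the CEO<ul><li>Jane Jones</li><li>Mike Smith</li></ul></li></ul>';
    // 'isHTML' => true marks $output as raw HTML;
    // 'noparse' => true skips wikitext parsing of it entirely.
    return array( $output, 'noparse' => true, 'isHTML' => true );
}
```

This avoids the cost of having the parser expand 5000 lines of `*`/`**` wikitext, which is the bottleneck the original question is about.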

[Wikitech-l] Performance of parsing links?

2015-01-27 Thread Daniel Barrett
I'm writing a parser function extension that outputs about 5000 lines of text (an organizational chart of a company) as a nested, bulleted list. * Bob the CEO ** Jane Jones ** Mike Smith *** etc. It takes about 3 seconds (real time) for MediaWiki to render this list, which is acceptable.