Re: [Wikitech-l] Distinguishing normal categories from categories added through transcluded templates

2011-07-14 Thread Paul Copperman
2011/7/14 Leo Koppelkamm:
> ( I could do something like
>     $templatesCache = $holders->parent->mTplExpandCache;
>     foreach ( $templatesCache as $template ) {
>         // strpos() returns false when there is no match, so compare strictly
>         if ( strpos( $template, $line ) !== false ) {
>             break;
>         }
>     }
>
> inside Parser::replaceInternalLinks2, L1980, but that's pretty hacky & not
> really stable either )
>
$parser->mTplExpandCache holds only the text of template calls without
parameters (e.g. {{Foo}}, but not {{bar|x|y}}), so it won't help you
much. You could do a separate parse of the article text while skipping
the preprocessor. This should give you only the categories that are
linked directly from the page. Of course for normal page views this
would probably be too expensive, so maybe an API module that you call
via AJAX?
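The comparison step of that approach could look like this (an illustrative sketch; the function name and data shapes are assumptions, not an existing MediaWiki interface):

```javascript
// Given the categories from a normal parse of the page and the categories
// from a separate parse that skips the preprocessor (so templates are never
// expanded), the difference is exactly the set added through templates.
function templateAddedCategories(allCategories, directCategories) {
  var direct = {};
  directCategories.forEach(function (name) {
    direct[name] = true;
  });
  return allCategories.filter(function (name) {
    return !direct[name];
  });
}

// templateAddedCategories(['Living people', 'Presidents'], ['Presidents'])
//   → ['Living people']
```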

HTH,

Paul Copperman

___
Wikitech-l mailing list
Wikitech-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikitech-l


Re: [Wikitech-l] HipHop

2011-04-05 Thread Paul Copperman
2011/4/5 Magnus Manske:
> So is the time spent with the actual expansion (replacing variables),
> or getting the wikitext for n-depth template recursion? Or is it the
> parser functions?
>
Well, getting the wikitext shouldn't be very expensive as it is cached
in several cache layers. Basically it's just expanding many, many
preprocessor nodes. A while ago I did a bit of testing with my
template tool on dewiki[1] and found that wikimedia servers spend
approx. 0.2 ms per expanded node, although there's of course much
variation depending on current load. My tool counts 303,905 nodes when
expanding [[Barack Obama]] so that would account for about 60 s of
render time. As already said, YMMV.
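The arithmetic behind that estimate, spelled out:

```javascript
// Back-of-the-envelope check of the figures above:
// 303,905 expanded nodes at roughly 0.2 ms each.
var nodes = 303905;
var msPerNode = 0.2;
var seconds = (nodes * msPerNode) / 1000;
console.log(seconds.toFixed(1) + ' s'); // prints "60.8 s"
```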

Paul Copperman

[1] <http://de.wikipedia.org/wiki/Benutzer:P.Copp/scripts/templateutil.js>,
you can test it with http://de.wikipedia.org/w/index.php?action=raw&title=Benutzer:P.Copp/scripts/templateutil.js&ctype=text/javascript
and a click on "Template tools" in the toolbox



Re: [Wikitech-l] HipHop

2011-04-05 Thread Paul Copperman
2011/4/5 Magnus Manske:
> For comparison: WYSIFTW parses [[Barack Obama]] in 3.5 sec on my iMac,
> and in 4.4 sec on my MacBook (both Chrome 12).
>
> Yes, it doesn't do template/variable replacing, and it's probably full
> of corner cases that break; OTOH, it's JavaScript running in a
> browser, which should make it much slower than a dedicated server
> setup running precompiled PHP.
>

Seriously, the bulk of the time needed to parse these enwiki articles
goes into template expansion. If you pre-expand them, taking care that
the templates inside ... tags get expanded as well, MediaWiki can parse
the article in a few seconds (3-4 on my laptop).
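A sketch of that pre-expansion step using the existing expandtemplates API (the endpoint path and the two-step client-side flow are assumptions about the setup described above):

```javascript
// Build a request that asks MediaWiki to expand all templates in the given
// wikitext; the expanded text could then be handed to a client-side parser
// such as WYSIFTW without paying the template-expansion cost there.
function buildExpandTemplatesUrl(apiBase, wikitext) {
  var params = new URLSearchParams({
    action: 'expandtemplates',
    format: 'json',
    text: wikitext
  });
  return apiBase + '?' + params.toString();
}

// buildExpandTemplatesUrl('/w/api.php', '{{Foo}}')
//   → '/w/api.php?action=expandtemplates&format=json&text=%7B%7BFoo%7D%7D'
```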



Re: [Wikitech-l] CSS/javascript injection for AJAX requests

2010-01-08 Thread Paul Copperman
The styles and JS are already available in the parser output in
->mHeadItems; it should be trivial to expose them through the API via
action=parse.
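Once those head items come back over the API, the client side still has to avoid injecting duplicates before appending anything to the live document. A minimal sketch (the helper name and data shape are illustrative, not an existing interface):

```javascript
// Filter the head items from a parse response down to the ones the current
// page does not already have, keyed by each item's name (e.g. a stylesheet
// identifier). The survivors would then be appended to document.head so
// extension CSS applies without a full reload.
function newHeadItems(presentNames, apiItems) {
  var seen = {};
  presentNames.forEach(function (name) {
    seen[name] = true;
  });
  return apiItems.filter(function (item) {
    return !seen[item.name];
  });
}
```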
So I've put this on bugzilla, see


P.Copp

On Fri, Jan 8, 2010 at 5:42 PM, Carl (CBM) wrote:
> I noticed today that livepreview does not pick up the
> dynamically-generated CSS from the SyntaxHighlight_Geshi extension.
> The same problem occurs in liquidthreads: when you add a comment with
> a Geshi call in it, the CSS will not be picked up when the comment is
> initially saved. The first full reload of the page will pick up the
> CSS correctly in either case.
>
> After some investigation, this is really an issue in core and will
> apply to any extension that needs to add CSS and/or javascript to the
> output HTML.  To fix the bugs with livepreview, we would need some
> mechanism where AJAX calls receive not only new HTML, but also new CSS
> and/or javascript, and can add that CSS and javascript to the current
> page without a reload.  Adding the CSS and javascript dynamically may
> be tricky from a compatibility standpoint, but having library
> functions in our site javascript would help with that.
>
> I have not investigated the cause of the problem in liquidthreads.
>
> The code in EditPage.php shows scars from similar problems, in a
> commented-out call to send a list of categories back to an AJAX
> preview request.
>
> - Carl
>



Re: [Wikitech-l] category page with curious data

2009-02-16 Thread Paul Copperman
On Mon, Feb 16, 2009 at 4:41 PM, Aryeh Gregor wrote:
> On Mon, Feb 16, 2009 at 9:38 AM, Uwe Baumbach wrote:
>> after our upgrade to 1.14 we see at one category page:
>>
>> http://wiki-de.genealogy.net/Kategorie:Stiftung_Stoye/Band_42_(Genealogische_Nachl%C3%A4sse)
>>
>> that this category should have 618 pages. But browsing through the
>> pages we can see there are only 351.
>> The correct number is shown by our own mini extension that queries the
>> "categorylinks" table...
>
> Running maintenance/populateCategory.php --force should fix this.  It
> will refresh *all* category table counts, however, not just one.
> There's currently no nice mechanism to force a category size recount,
> other than removing enough entries to get it below 200 and then
> re-adding them.  There's probably a bug open for this somewhere.

BTW: There's also a bug open about category counters not being updated on
article deletion, see https://bugzilla.wikimedia.org/show_bug.cgi?id=17155.
