Tim Starling wrote:
>> Another goal beyond editing itself is normalizing the world of 'alternate
>> parsers'. Several have been announced recently, and we now have a large
>> array of them available, all a little different. We even use mwlib ourselves
>> in the PDF/ODF export deployment, and while we don't maintain that engine,
>> we need to coordinate a little with the people who do so that new
>> extensions and structures get handled.
> 
> I know that there is a camp of data reusers who like to write their
> own parsers. I think there are more people who have written a wikitext
> parser from scratch than have contributed even a small change to the
> MediaWiki core parser. They have a lot of influence, because they go
> to conferences and ask for things face-to-face.
> 
> Now that we have HipHop support, we have the ability to turn
> MediaWiki's core parser into a fast, reusable library. The performance
> reasons for limiting the amount of abstraction in the core parser will
> disappear. How many wikitext parsers does the world really need?

I realize you have a dry wit, but I imagine this joke was lost on nearly
everyone. You're not really suggesting that everyone who wants to parse
MediaWiki wikitext compile and run HipHop PHP in order to do so.

It's unambiguously a fundamental goal that content on Wikimedia wikis be
easy to redistribute, share, and spread. A wikisyntax that's impossible to
parse adequately in other environments (or in Wikimedia's environment, for
that matter) is a serious inhibitor to this goal.

MZMcBride


