Chad wrote:
> On Tue, May 3, 2011 at 2:15 PM, MZMcBride <z...@mzmcbride.com> wrote:
>> I realize you have a dry wit, but I imagine this joke was lost on nearly
>> everyone. You're not really suggesting that everyone who wants to parse
>> MediaWiki wikitext compile and run HipHop PHP in order to do so.
> 
> And how is using the parser with HipHop going to be any more
> difficult than using it with Zend?

The point is that the wikitext and its parsing should be completely separate
from MediaWiki/PHP/HipHop/Zend.

I think some of the bigger picture is getting lost here. Wikimedia produces
XML dumps that contain wikitext. For most people, this is the only way to
obtain and reuse large amounts of content from Wikimedia wikis (especially
as the HTML dumps haven't been re-created since 2008). There needs to be an
easy way for others to work with this content.
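
To make the re-user problem concrete, here's a minimal sketch (Python,
standard library only; the filename and function name are mine, not anything
shipped with the dumps) of pulling raw wikitext out of a pages-articles dump.
Getting the wikitext out is the trivial part; parsing it is where re-users
currently hit a wall.

import xml.etree.ElementTree as ET

def iter_pages(dump_path):
    """Yield (title, wikitext) pairs from a MediaWiki XML dump."""
    title, text = None, None
    for _, elem in ET.iterparse(dump_path, events=("end",)):
        tag = elem.tag.rsplit("}", 1)[-1]  # drop the {namespace} prefix
        if tag == "title":
            title = elem.text
        elif tag == "text":
            text = elem.text or ""
        elif tag == "page":
            yield title, text
            elem.clear()  # free memory; these dumps run to tens of GB

if __name__ == "__main__":
    # "pages-articles.xml" is a hypothetical local copy of a dump file
    for title, wikitext in iter_pages("pages-articles.xml"):
        print(title, len(wikitext))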

Many people have suggested (with good reason) that this means that wikitext
parsing needs to be reproducible in other programming languages. While
HipHop may be the best thing since sliced bread, I've yet to see anyone put
forward a compelling reason that the current state of affairs is acceptable.
Saying "well, it'll soon be much faster for MediaWiki to parse" doesn't
overcome the legitimate issues that re-users have (such as programming in a
language other than PHP, perish the thought).

For me, the idea that all that's needed is a faster parser in PHP is a
complete non-starter.

MZMcBride



