2009/9/25 Roan Kattouw <roan.katt...@gmail.com>:

> The point is that wikitext doesn't *have* parsing errors. The parser
> is very tolerant in that it tries to resolve 'invalid' and ambiguous
> constructs by some combination of guessing what was probably intended
> and trying not to mess up the rest of the article (the newline thing
> mentioned earlier falls in the latter category). I agree that this
> probably causes the weird quirks that make wikitext such a horribly
> complex language to define and parse, so I think a good way to
> continue this discussion would be to talk about how invalid, ambiguous
> and otherwise unexpected input should be handled.


In past discussions I have noted that "tag soup" is a *feature* of
human languages, not a bug.

HTML was constructed as a computer markup language for humans.
Unfortunately, the humans in question were nuclear physicists; lesser
beings couldn't handle it.

Note how HTML5 now defines how to handle "bad" syntax, in recognition
of the fact that humans write tag soup.
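
As a rough illustration (my own sketch, not something from this thread): a
lenient, non-validating parser such as Python's stdlib html.parser will walk
straight through mis-nested and unclosed tags instead of raising an error,
which is the same recovery philosophy HTML5 codifies (the full HTML5
tree-construction algorithm lives in libraries like html5lib):

    from html.parser import HTMLParser

    class SoupTolerantParser(HTMLParser):
        # Non-validating: it reports tags as it encounters them and never
        # raises on mis-nesting or missing close tags.
        def handle_starttag(self, tag, attrs):
            print("start:", tag)

        def handle_endtag(self, tag):
            print("end:  ", tag)

    # Mis-nested "tag soup": no exception, parsing simply continues.
    SoupTolerantParser().feed("<b>bold <i>both</b> still italic")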

Wikitext spitting out parse errors would be a bug in wikitext, not a feature.


- d.
