> "Barry (work)" writes:
> >> >As a general point, about the need for parsers to be forgiving - there is
> >an
> >> >argument that 'over-aggressive' parsers are useful because they prevent
> >> >unpredictable and potentially serious operational problems
> >> >from arising in a complex interworking environment.

I'm not sure the problem is that interoperability is in danger. As a lot
of people have pointed out, interoperability is helped by receivers being
flexible in what they accept. I do think Barry has got a point, though,
in that very flexible implementations do nothing to discourage
breaking the standards as long as things work.  An obvious example of
where this has happened is the Web, where extremely lax browsers have
meant that a huge amount of malformed HTML is now in existence. People
might run notepad, handwrite some malformed HTML, check that it renders
OK with Netscape, and then publish it.
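
For illustration, here is a small Python sketch (using the stdlib
html.parser, not any browser engine) of just how tolerant such parsing
is in practice. The misnested, unclosed markup below produces no error
at all:

    from html.parser import HTMLParser

    class TagLogger(HTMLParser):
        def handle_starttag(self, tag, attrs):
            print("start:", tag)
        def handle_endtag(self, tag):
            print("end:", tag)

    # Misnested and unclosed tags, the kind a notepad user might
    # publish after a quick visual check in a browser.
    TagLogger().feed("<b><i>bold</b> still italic?<p>never closed")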

So one can certainly argue against browsers being so liberal in
what they accept, and a lot of people have done that. In fact, the
designers of XML wanted to avoid repeating the HTML situation and so
included text in the spec explicitly prohibiting parsers from accepting
malformed XML documents, even when the intent might be guessed.
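
Any conforming parser has to behave like this sketch (Python's stdlib
here, but the spec makes the fatal error mandatory for all of them):

    import xml.etree.ElementTree as ET

    try:
        # Misnested tags: the intent is guessable, but guessing is
        # exactly what XML forbids.
        ET.fromstring("<b><i>bold italic</b></i>")
    except ET.ParseError as err:
        print("rejected:", err)   # "mismatched tag: line 1, ..."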

Maybe it wouldn't be such a bad thing if implementations at the bakeoffs
at least generated warning messages (does anyone use the Warning:
header?) or something like that.
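
Something along these lines, say (a hypothetical helper, with
example.org standing in for the warn-agent; 199 is HTTP/1.1's
"Miscellaneous warning" code):

    def lenient_response(body, problems):
        # Accept the sloppy input, but flag each problem with a
        # Warning header so the sender has a chance to notice.
        headers = [
            "HTTP/1.1 200 OK",
            "Content-Length: %d" % len(body),
        ]
        for text in problems:
            # Warning: <warn-code> <warn-agent> "<warn-text>"
            headers.append('Warning: 199 example.org "%s"' % text)
        return "\r\n".join(headers) + "\r\n\r\n" + body

    print(lenient_response("<p>hi", ["malformed Accept header ignored"]))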

Anders

-- 
Anders Kristensen <[EMAIL PROTECTED]>,
http://www-uk.hpl.hp.com/people/ak/
Hewlett-Packard Labs, Bristol, UK
