Allowing erroneous pages to render is not inherently bad, especially with
standards that can change in the future. Compare earlier HTML (pre-4) with
XHTML. If a page is written in XHTML and is read by a browser that doesn't
understand XHTML, it is very desirable for the end user that the browser
still attempts to render it, ignoring any constructs it doesn't know how to
handle (the self-closing tags, for example). For that reason I don't think
this is a good counter-example.
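
To make that concrete, here is a rough sketch (not any real browser's code,
just an illustration in C) of the kind of leniency I mean: a tag reader that
drops a trailing '/' it doesn't recognise, so an XHTML-style "<br/>" ends up
handled exactly like the "<br>" it already knows:

    #include <ctype.h>
    #include <stdio.h>

    /* Copy the tag name out of something like "<br/>" or "<br>",
     * quietly ignoring the self-closing slash it doesn't understand. */
    static void read_tag(const char *src, char *name, size_t len)
    {
        size_t i = 0;

        if (*src == '<')
            src++;
        while (*src && *src != '>' && *src != '/'
               && !isspace((unsigned char)*src) && i + 1 < len)
            name[i++] = *src++;
        name[i] = '\0';  /* attributes and the stray '/' are simply skipped */
    }

    int main(void)
    {
        char name[16];

        read_tag("<br/>", name, sizeof(name));
        printf("%s\n", name);  /* prints "br", same as it would for "<br>" */
        return 0;
    }

The unknown syntax is dropped rather than treated as a fatal error, which is
exactly what makes the page still usable on the older browser.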

On July 9, 2020 4:59:09 PM EDT, Michael Conrad <mcon...@intellitree.com> wrote:
>On 7/9/2020 3:16 PM, Markus Gothe wrote:
>> Jon Postel formulated the robustness principle decades ago. It is
>> still good advice today to "be liberal in what you accept and strict
>> in what you send".
>
>Counterexample: Internet Explorer
>
>It allowed so much garbage to render correctly that other browser 
>vendors had to work overtime to accept all the same garbage and make 
>sure it rendered in the same way.  Then, when IE was no longer
>defining the standard, progress was hamstrung by needing to stay
>compatible with its own past allowances lest they be accused of
>breaking people's intranets.  So much so that they just weren't able
>to fix most of their bugs and eventually abandoned the project.  If
>they had just declared tighter standards and enforced the rules, web
>development might not have been a misery for an entire decade.
>

-- 
Sent from my Android phone with K-9 Mail. Please excuse my brevity.
_______________________________________________
busybox mailing list
busybox@busybox.net
http://lists.busybox.net/mailman/listinfo/busybox
