Re: W3C Proposed Recommendation: HTML5

2014-09-21 Thread Boris Zbarsky

On 9/21/14, 9:00 AM, James Graham wrote:

I am substantially less convinced that tying these tests to the spec
lifecycle makes sense.


Agreed.  The only reason it's an issue for me is the lack of 
errata-issuance by the W3C and hence the tendency to attempt to enshrine 
obviously-wrong things in specifications forever.



The W3C Process encourages people to think of
interoperability as a binary condition; either implementations are
interoperable or they are not.


More interestingly, either the specification is implementable or it is not. 
Again, this matters because once the REC is published everyone goes home and 
never touches that document again.


The two-implementations condition was there to make sure you didn't end 
up with specs that basically reflected one UA's internals and that no 
one else could implement.



However the incentives for testing
under such a regime are not well aligned


Yes, absolutely agreed.


I would much prefer to
have a testsuite written by people making a genuine effort to find
errors in implementations even if the result is that no one passes every
single test.


This would be much more helpful, absolutely.

Of course then you need to ask yourself whether the test failures are 
just bugs in implementations or fundamental problems with the spec.  A 
question spec writers are often loath to ask.  :(



I'm also dubious that requiring a test for every assertion in the spec
is a standard that we are prepared to hold ourselves to when we ship
code.


Indeed.


Since shipping code is, in the grand scheme of things,
substantially more important than shipping a spec — the former affecting
all our users and the latter only lawyers


I wish specs only affected lawyers, especially given how they are often 
created/maintained.


Sadly, they do affect web developers and browser developers, not to 
mention other specs.



it doesn't seem at all
reasonable to demand that the specification is held to a higher standard.


Note that I made no such demand, precisely because I think it's unrealistic.


My concrete suggestion is that we, as an organisation, work to achieve
parity between the tests we require to ship our own code and those we
release in ways that can be used to support a spec and, more
importantly, those that can be used to verify that different 
implementations match up.


This is a good idea, but not terribly relevant to what dbaron should say 
in his AC rep capacity, right?



Making this process as low-overhead as
possible is something that I'm working on.


And it's much appreciated!

Note that actually sanely testing something like navigation in 
non-browser-specific ways is ... hard.  Basic things like "open a 
cross-origin window and wait for it to load" aren't really possible.  :(
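
For concreteness, here is roughly what a generic test would like to write, 
and why it can't (the URL is just a hypothetical placeholder):

  // Sketch only: try to open a cross-origin window and wait for its load.
  var w = window.open("https://cross-origin.example/page.html");
  try {
    // Touching addEventListener (or document, onload, ...) on a
    // cross-origin WindowProxy throws (a SecurityError per spec), so there
    // is no direct way for the test to observe the load.
    w.addEventListener("load", function() { /* never runs */ });
  } catch (e) {
    // The test is left with polling, postMessage from a cooperating page
    // on the other origin, or browser-specific hooks.
  }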



Obviously this isn't going to make a short-term difference for old
features like WindowProxy. I'm not sure what to suggest for those cases
given that we have de-facto shown an unwillingness to invest even
relatively small amounts of effort into reviewing existing tests that
could be part of the HTML testsuite for that feature [1].


Missing reference?


In the longer term, one might hope that bugfixes will produce new
testcases that could be upstreamed, and Servo might need a proper
testsuite to achieve interoperability. Having said that, Servo has so
far not produced a significant number of tests, which has been a little
surprising as they have been implementing some of the core pieces of the
platform which are indeed under-tested. I suspect this is because the
skills, interests and goals of the team are around producing code rather
than tests.


Yep.  And because they really haven't been aiming for full web compat 
yet, I'd hope, but rather prototyping out some of the parallelism ideas.



The WG turned into an
unproductive, bureaucratic nightmare and succeeded in driving out their
core constituency, leaving the remaining participants to debate topics of
little consequence.


Yup.


If we were to complain about the lack of testing for HTML


Again, note that I don't think we have any realistic way to complain 
about it, for precisely the reasons you list.



This idea of shipping on a date-based schedule isn't actually
obviously bad, as long as you set the expectations correctly, which is
something the W3C will get wrong.


I hope you're wrong (e.g. that the W3C will actually continue fairly 
frequent date-based updates to HTML), but I fear you're right.



So yes, I think that, at the moment, "everybody knows" that looking at HTML
under w3.org/TR/ is a mistake. Even if they don't, it's probably not
*that* important because HTML isn't the most interesting spec right now;


Unless you're Servo, yeah.

-Boris


Re: http-schemed URLs and HTTP/2 over unauthenticated TLS (was: Re: WebCrypto for http:// origins)

2014-09-21 Thread Richard Barnes
Pretty sure that what he's referring to is called DANE.  It lets a domain 
holder assert a certificate or key pair, using DNSSEC to bind it to the domain 
instead of PKIX (or in addition to PKIX).

https://tools.ietf.org/html/rfc6698
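
For a sense of what that looks like in practice, the domain publishes a TLSA
record in its DNSSEC-signed zone, roughly like this (the name and hash are
purely illustrative placeholders):

  ; TLSA for TLS on port 443 of www.example.com (RFC 6698)
  ; usage 3 = domain-issued cert, selector 1 = SubjectPublicKeyInfo,
  ; matching type 1 = SHA-256 of that key
  _443._tcp.www.example.com. IN TLSA 3 1 1 <sha-256 hex of the server key>

A client that validates the DNSSEC chain can then pin the server's key to
that record instead of (or in addition to) a CA-issued certificate.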



On Sep 21, 2014, at 8:01 AM, Anne van Kesteren wrote:

> On Sun, Sep 21, 2014 at 1:14 PM, Aryeh Gregor wrote:
>> What happened to serving certs over DNSSEC?  If browsers supported
>> that well, it seems it has enough deployment on TLDs and registrars to
>> be usable to a large fraction of sites.
> 
> DNSSEC does not help with authentication of domains and establishing a
> secure communication channel as far as I know. Is there a particular
> proposal you are referring to?
> 
> 
> -- 
> https://annevankesteren.nl/


Re: W3C Proposed Recommendation: HTML5

2014-09-21 Thread James Graham
On 20/09/14 03:46, Boris Zbarsky wrote:
> On 9/19/14, 8:23 PM, L. David Baron wrote:
>> W3C recently published the following proposed recommendation (the
>> stage before W3C's final stage, Recommendation):
> 
> The biggest issue I have with this is exiting CR without anything
> resembling a comprehensive enough test suite to ensure anything like
> interop on some of the core hard pieces (they left out the navigation
> algorithm, smart, but still have the bogus WindowProxy spec in this
> supposed PR, for example).
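
For anyone who hasn't had the pleasure, the WindowProxy problem in one
sketch (page.html standing in for some hypothetical same-origin document):

  // frame.contentWindow is a WindowProxy: it keeps its identity across
  // navigations even though the Window behind it is replaced.
  var frame = document.createElement("iframe");
  frame.src = "page.html";
  frame.onload = function() {
    frame.onload = null;            // only interested in the first load
    var w = frame.contentWindow;
    w.expando = 42;                 // lands on the current inner Window
    frame.src = "page.html?2";      // navigate: a new inner Window is created
    frame.onload = function() {
      // frame.contentWindow === w        -> true: same WindowProxy object
      // "expando" in frame.contentWindow -> false: it stayed on the old Window
    };
  };
  document.body.appendChild(frame);

Specifying, let alone testing, exactly which bits must behave like that is
the hard part.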

Obviously I agree that good testing of implementations is key to
interoperability. I also agree that we should encourage vendors to
create and run shared tests for the web technologies that we implement.
I am substantially less convinced that tying these tests to the spec
lifecycle makes sense. The W3C Process encourages people to think of
interoperability as a binary condition; either implementations are
interoperable or they are not. This leads to ideas like the CSS WG's
rule that two implementations must pass every test. On the face of it
this may appear eminently sensible. However the incentives for testing
under such a regime are not well aligned with the goal of finding bugs
in implementations; in essence people are encouraged to write as many
tests as are needed to satisfy the letter of the rules but to make them
all as shallow and unlikely to find bugs as possible to avoid causing
unwanted holdups in the specification process. I would much prefer to
have a testsuite written by people making a genuine effort to find
errors in implementations even if the result is that no one passes every
single test.

Of course it's possible that going through the spec and making sure
there is at least one test for every conformance criterion will make the
testsuite good even if people are determined to produce a poor
testsuite. However I strongly doubt this to be the case. Indeed I'm only
aware of a handful of examples of someone setting out to write a test
for every conformance criterion in a specification and ending up with a
useful result. The canvas / 2d context tests are one of those cases, and
that benefits from being a rather self-contained set of operations
without much interaction with the rest of the platform. Even if someone
took the same approach to, say, document loading tests, it is unlikely
that the result would be as good because the features are much more
likely to interact in unpleasant ways so that, for example, the load
event and document.open might work independently but using the two in
combination might break in an unexpected way.
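
To make that concrete, the kind of test one actually wants looks more like
the following sketch (assuming testharness.js and a trivial, hypothetical
same-origin child.html):

  async_test(function(t) {
    var frame = document.createElement("iframe");
    frame.src = "child.html";
    frame.onload = t.step_func_done(function() {
      // document.close() below may fire load on the frame again in some
      // implementations, so detach the handler before doing anything else.
      frame.onload = null;
      var doc = frame.contentDocument;
      doc.open();                      // reopen the already-loaded document
      doc.write("<p>replaced</p>");
      doc.close();
      assert_equals(frame.contentDocument.body.textContent, "replaced");
    });
    document.body.appendChild(frame);
  }, "document.open() after the load event replaces the document");

and it is exactly the "after the load event" part that a per-assertion
checklist tends not to reach.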

I'm also dubious that requiring a test for every assertion in the spec
is a standard that we are prepared to hold ourselves to when we ship
code. Since shipping code is, in the grand scheme of things,
substantially more important than shipping a spec — the former affecting
all our users and the latter only lawyers — it doesn't seem at all
reasonable to demand that the specification is held to a higher standard.

> My second biggest issue is that I don't have a concrete proposal for
> addressing the first issue.

My concrete suggestion is that we, as an organisation, work to achieve
parity between the tests we require to ship our own code and those we
release in ways that can be used to support a spec and, more
importantly, those that can be used to verify that different
implementations match up. In practice this means not writing tests in
Mozilla-specific formats, and making sure that we have a way to upstream
tests that we've written. Making this process as low-overhead as
possible is something that I'm working on.

Obviously this isn't going to make a short-term difference for old
features like WindowProxy. I'm not sure what to suggest for those cases
given that we have de-facto shown an unwillingness to invest even
relatively small amounts of effort into reviewing existing tests that
could be part of the HTML testsuite for that feature [1].

In the longer term, one might hope that bugfixes will produce new
testcases that could be upstreamed, and Servo might need a proper
testsuite to achieve interoperability. Having said that, Servo has so
far not produced a significant number of tests, which has been a little
surprising as they have been implementing some of the core pieces of the
platform which are indeed under-tested. I suspect this is because the
skills, interests and goals of the team are around producing code rather
than tests. For people making small contributions it would also be
rather off-putting to be told "no you can't land this fix that makes
Wikipedia look better without a comprehensive testsuite for the relevant
feature". However if we as an organisation really care about testing
core platform features which already have an implementation in gecko,
one way to achieve that would be to give someone working on Servo the
explicit job of creating testsuites for the big-ticket features they
implement as they implement them.

> Maybe it all d

Re: http-schemed URLs and HTTP/2 over unauthenticated TLS (was: Re: WebCrypto for http:// origins)

2014-09-21 Thread Anne van Kesteren
On Sun, Sep 21, 2014 at 1:14 PM, Aryeh Gregor wrote:
> What happened to serving certs over DNSSEC?  If browsers supported
> that well, it seems it has enough deployment on TLDs and registrars to
> be usable to a large fraction of sites.

DNSSEC does not help with authentication of domains and establishing a
secure communication channel as far as I know. Is there a particular
proposal you are referring to?


-- 
https://annevankesteren.nl/


Re: http-schemed URLs and HTTP/2 over unauthenticated TLS (was: Re: WebCrypto for http:// origins)

2014-09-21 Thread Aryeh Gregor
On Mon, Sep 15, 2014 at 11:34 AM, Anne van Kesteren wrote:
> It seems very bad if those kind of devices won't use authenticated
> connections in the end. Which makes me wonder, is there some activity
> at Mozilla for looking into an alternative to the CA model?

What happened to serving certs over DNSSEC?  If browsers supported
that well, it seems it has enough deployment on TLDs and registrars to
be usable to a large fraction of sites.