Re: Intent to ship: WebCrypto API

2014-09-05 Thread Henri Sivonen
On Thu, Sep 4, 2014 at 6:29 PM, Ehsan Akhgari  wrote:
> On 2014-09-04, 4:42 AM, Anne van Kesteren wrote:
>>
>> On Wed, Sep 3, 2014 at 7:36 PM, Tim Taubert  wrote:
>>>
Chromium has had the WebCrypto API enabled by default since Chrome 37,
>>> which was released in late June 2014. Their implementation supports a
>>> subset of the algorithms that we do, and has roughly the same level of
>>> spec compliance. We expect the two implementations to be mostly
>>> interoperable as-is, with some fine points being ironed out as the spec
>>> is finalized.
>>
>>
>> In Chromium the methods only work on authenticated origins. What is
>> our story?
>
>
> I hope that we are not doing that.  I don't think Chromium's reasoning there
> makes sense.

I agree with Ehsan.

For reference, the Chromium "Intent to Ship" is
https://groups.google.com/a/chromium.org/forum/#!msg/blink-dev/Tn3pfJZDcGg/nUlvUOFKL_QJ
and says "This API will only be exposed to secure origins, as
security-sensitive operations are inherently dangerous when performed
on insecure connections."

It doesn't explain how exposing Web Crypto to http origins makes
things worse compared to http origins not having Web Crypto and
implementing the same algorithms in pure JS. Compared to pure JS, Web
Crypto provides implementations that are potentially closer to
constant-time, but that's an improvement and doesn't make things
worse. The same goes for its generally higher speed. Compared to
regular JS storing key material in
IndexedDB, Web Crypto makes it possible to make the key material not
readable by JS (JS can request operations to be performed with the
keys but not read back the keys themselves). That, again, seems like
an improvement compared to not having Web Crypto available.

There's a possible argument that it's dangerous for people to be
confused about what sort of security characteristics Web Crypto can
provide on unauthenticated origin, but I think it is futile to try to
install clue into would-be users by requiring an authenticated origin:
the low-level nature of the API is such a footgun (the name "subtle"
is a euphemism for that) that if someone is confused about what
security characteristics Web Crypto can provide on unauthenticated
origin, there are plenty of other things about Web Crypto to be
confused about in ways that lead to insecure outcomes. On the other
hand, if the threat model that's being addressed is a passive
eavesdropper (as opposed to an active attacker who changes network
traffic), Web Crypto can provide additional confidentiality compared
to no crypto being in use. Since in this case, the browser does not
make any representations in the UI to the user about the
confidentiality, integrity or authenticity of the communications,
nothing is being misrepresented to the user. However, a site that
considers it worthwhile to hide certain things from passive
eavesdroppers can still benefit.

As for making new features unavailable without TLS in order to promote
the use of TLS, I think the TLS requirement should be motivated by
some real security or deployability concerns to avoid unproductive
resentment by Web developers. (Otherwise, we should be restricting new
CSS to https!) Also, while restricting special-interest features might
feel good (by creating a feeling that we are restricting *something*)
and be more feasible than restricting features that have broad appeal,
it is also correspondingly less impactful.

In this sense, I think restricting HTTP/2 to TLS is the right call.
The restriction to TLS addresses a true deployability concern.
Furthermore, better performance has broad appeal and, on the flip
side, being denied better performance just makes things slower
instead of preventing stuff altogether. I also think that restricting
Service Workers to authenticated origins is the right call, because
failure to do so would mean that an active MITM could continue to have
an effect after the actual MITMing opportunity has gone (e.g. when
the user has left a café with an open Wi-Fi AP). I also think that
restricting privacy-sensitive APIs that need per-origin permissions to
authenticated origins is the right call, because if the origin isn't
authenticated, the browser can't really limit the scope of the
permission that has been granted, since an active MITM can inject
stuff to any unauthenticated origin that has obtained the permission.
(I'm not sure if getUserMedia is restricted along these lines, but I
think it should be if it isn't already.)

Furthermore, APIs whose effects aren't tightly coupled with a
particular page are especially poor candidates for being restricted to
authenticated origins. If the API doesn't have effects that are
visible and coupled to a page (Web Crypto is a data in, data out API),
it's possible to serve an iframe from an https origin and offer a
postMessage interface that proxies the API to an http-origin parent
page. One might argue that this is progress, because then the site h

Re: Intent to implement and ship: ImageCapture

2014-09-05 Thread Henri Sivonen
On Wed, Sep 3, 2014 at 12:15 PM, Alfredo Yang  wrote:
> Summary:
> Allow web authors to take photo via gUM video track.

Does this have the same privacy protections as current gUM?

Is current gUM restricted to authenticated origins? If it isn't, is it
realistic to restrict it to authenticated origins?

I gather that gUM requires prompting even for packaged apps, which
seems good to me. Is that the case? However, the Camera app currently
has access to the camera without any prompting ever. Will the Camera
app start prompting or will it perhaps be pre-authorized somehow when
it migrates to gUM+this new API? If yes, could the pre-authorization
be reflected in App permissions? (Currently the Camera app clearly has
special powers that App permissions doesn't show, which makes me feel
I can't trust App permissions. See
https://bugzilla.mozilla.org/show_bug.cgi?id=1062246 )

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/
___
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform


Re: Intent to implement and ship: ImageCapture

2014-09-05 Thread Robert O'Callahan
On Fri, Sep 5, 2014 at 10:19 PM, Henri Sivonen  wrote:

> Does this have the same privacy protections as current gUM?
>

Yes. You can only use this on a stream you've already acquired (e.g. via
current gUM, but other APIs also produce streams). You can already shunt a
MediaStream to a <video> element and then drawImage that to a canvas to get
stream pixels.

> Is current gUM restricted to authenticated origins? If it isn't, is it
> realistic to restrict it to authenticated origins?

That's a good idea but it's a separate issue.

Rob
-- 
I tell you that anyone who is angry with a brother or sister will be
subject to judgment. Again, anyone who says to a brother or sister,
‘Raca,’ is answerable to the court. And anyone who says, ‘You fool!’
will be in danger of the fire of hell.


Restricting gUM to authenticated origins only (was: Re: Intent to implement and ship: ImageCapture)

2014-09-05 Thread Henri Sivonen
On Fri, Sep 5, 2014 at 1:25 PM, Robert O'Callahan  wrote:
> On Fri, Sep 5, 2014 at 10:19 PM, Henri Sivonen  wrote:
>> Is current gUM restricted to authenticated origins? If it isn't, is it
>> realistic to restrict it to authenticated origins?
>
> That's a good idea but it's a separate issue.

Is it already being pursued or should I file a bug?

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/


Re: Restricting gUM to authenticated origins only (was: Re: Intent to implement and ship: ImageCapture)

2014-09-05 Thread Robert O'Callahan
On Fri, Sep 5, 2014 at 10:34 PM, Henri Sivonen  wrote:

> On Fri, Sep 5, 2014 at 1:25 PM, Robert O'Callahan 
> wrote:
> > On Fri, Sep 5, 2014 at 10:19 PM, Henri Sivonen 
> wrote:
> >> Is current gUM restricted to authenticated origins? If it isn't, is it
> >> realistic to restrict it to authenticated origins?
> >
> > That's a good idea but it's a separate issue.
>
> Is it already being pursued or should I file a bug?
>

I don't know.

How about other site-specific sticky state? about:permissions suggests the
full list is
* Passwords
* Geolocation
* gUM
* Cookies
* Popup windows
* Offline storage
* Fullscreen
Cookies are segregated by http vs https, right? I hope other kinds of
offline storage, and passwords, are as well. Popup windows are just a
nuisance so not important here. That leaves gUM, geolocation and
fullscreen. Can we make them all require TLS?

Rob
-- 
I tell you that anyone who is angry with a brother or sister will be
subject to judgment. Again, anyone who says to a brother or sister,
‘Raca,’ is answerable to the court. And anyone who says, ‘You fool!’
will be in danger of the fire of hell.


Re: Restricting gUM to authenticated origins only (was: Re: Intent to implement and ship: ImageCapture)

2014-09-05 Thread Henri Sivonen
On Fri, Sep 5, 2014 at 1:47 PM, Robert O'Callahan  wrote:
> On Fri, Sep 5, 2014 at 10:34 PM, Henri Sivonen  wrote:
>>
>> On Fri, Sep 5, 2014 at 1:25 PM, Robert O'Callahan 
>> wrote:
>> > On Fri, Sep 5, 2014 at 10:19 PM, Henri Sivonen 
>> > wrote:
>> >> Is current gUM restricted to authenticated origins? If it isn't, is it
>> >> realistic to restrict it to authenticated origins?
>> >
>> > That's a good idea but it's a separate issue.
>>
>> Is it already being pursued or should I file a bug?
>
>
> I don't know.
>
> How about other site-specific sticky state? about:permissions suggests the
> full list is
> * Passwords

It's not really worthwhile to refuse to remember passwords on http
origins, since the user will then just type the password anyway, so we
can't really protect the user against an active MITM going after
passwords on http sites.

> * Geolocation

In principle, I think geolocation should be restricted to
authenticated origins. Unfortunately, it might be too late
compatibility-wise to do that at this point. Also, since the
geolocation responses are easily proxied over postMessage, I think the
potential for a win is less than with gUM, whose response is a special
kind of object that doesn't travel (I hope!) over postMessage.

> * gUM

Yes. I'm hoping that the non-demo uses of gUM are already often enough
on authenticated origins for it not to be too late to restrict gUM to
authenticated origins.

> * Cookies

Being able to limit cookies to authenticated origins only would be a
big win, but it would probably "break the Web" too much. :-(

> * Popup windows

As far as annoyances go, Web Notifications require a permission, right?

> * Offline storage

This one is tricky, since it is somewhere between cookies, which we
probably can't fix, and Service Workers, which we can. Probably by
now, it would "break the Web" too much to limit offline storage to
authenticated origins only.

> * Fullscreen

Unfortunately, this has use cases together with MSE, and sites that
use MSE will probably have the hardest time migrating to authenticated
origins, because mixed-content XHR is blocked, MSE assumes the media
data is fetched using XHR, and migrating the media data to https is a
big deal. That is, unless all browsers together adopt some way to do
mixed-content XHR such that the response data can't be read by JS but
can be pushed to MSE. Since MSE is already out there in IE, Chrome and
test versions of Safari and we are under notable pressure to ship MSE
ASAP, it seems too late to introduce such a mixed-content exception
mechanism for XHR. :-(

> Cookies are segregated by http vs https, right?

Cookies marked secure and served over https are. Cookies not marked
secure infamously are not. (And the Same Origin violations of cookies
don't even stop at the scheme, since they aren't fully segregated by
host name, either. It's very sad.)
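
(For reference, the distinction above is the `Secure` attribute on the Set-Cookie header; the cookie names here are made up:

```http
Set-Cookie: sid=abc123; Secure; HttpOnly
Set-Cookie: prefs=dark
```

The first cookie is only sent back over https; the second is sent over both http and https, which is the infamous case.)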

> I hope other kinds of
> offline storage, and passwords, are as well.

My understanding is that stuff other than cookies is Origin-based, so
the scheme matters.

> Popup windows are just a
> nuisance so not important here.

Yes.

> That leaves gUM, geolocation and fullscreen.
> Can we make them all require TLS?

For the reasons given above, I think we could require authenticated
origin for gUM but not fullscreen. Requiring an authenticated origin
for geolocation would break things, but we *might* be able to live
with the level of breakage. (Though likely not.)

-- 
Henri Sivonen
hsivo...@hsivonen.fi
https://hsivonen.fi/


Re: Restricting gUM to authenticated origins only (was: Re: Intent to implement and ship: ImageCapture)

2014-09-05 Thread Eric Rescorla
On Fri, Sep 5, 2014 at 3:34 AM, Henri Sivonen  wrote:

> On Fri, Sep 5, 2014 at 1:25 PM, Robert O'Callahan 
> wrote:
> > On Fri, Sep 5, 2014 at 10:19 PM, Henri Sivonen 
> wrote:
> >> Is current gUM restricted to authenticated origins? If it isn't, is it
> >> realistic to restrict it to authenticated origins?
> >
> > That's a good idea but it's a separate issue.
>
> Is it already being pursued or should I file a bug?


It's not being pursued. It was considered in the WG and rejected.

-Ekr


Re: Review Board Preview

2014-09-05 Thread Ehsan Akhgari

This is fantastic!  Thanks for all of the team effort here.

Everyone, please take a few minutes to check this out.  Seriously.  :)

On 2014-09-04, 8:57 PM, Mark Côté wrote:

I know lots of people are very interested in the on-going project to
replace Splinter with a modern code-review tool.  After a colourful
variety of setbacks, this project[1], based on Review Board[2], is very
nearly ready for initial deployment.  I put up a preview screencast on
my blog[3] to give you an idea of what to expect.  Barring any other
unforeseen circumstances, it should be ready for use on a select number
of repositories in a couple weeks.

Mark

[1] https://wiki.mozilla.org/Auto-tools/Projects/CodeReviewTool
[2] https://www.reviewboard.org/
[3] https://mrcote.info/blog/2014/09/04/review-board-preview/


Re: Deferred display of XUL panel containing embedded iframe

2014-09-05 Thread Yonggang Luo
I also inserted an iframe into a panel, but the problem I faced is
that autohide doesn't work: the panel weirdly fires the hidden and
shown events, which shouldn't happen.


web-platform-tests now running in automation

2014-09-05 Thread James Graham
The web-platform-tests testsuite has just landed on
Mozilla-Central. It is an import of a testsuite collated by the W3C
[1], which we intend to keep up-to-date with upstream. The tests are
located in /testing/web-platform/tests/ and are now running in automation.

Initially the testsuite, excluding the reftests, is running on Linux
64 opt builds only. If it doesn't cause problems there it will be
rolled out to other configurations, once we are confident they will
be equally stable.

The jobs are indicated on tbpl and treeherder by the symbols W1-W4. The
reftests will be Wr once they are enabled.

== How does this affect me? ==

Because web-platform-tests is imported from upstream we can't make
assumptions like "all tests will pass". Instead we explicitly store
the expected result of every test that doesn't just pass in an
ini-like file with the same name as the test and a .ini suffix in
/testing/web-platform/meta/. If you make a change that affects the
result of a web-platform-test you need to update the expected results
or the testsuite will go orange.

Instructions for performing the updates are in the README file
[2]. There is tooling available to help in the update process.
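
(To illustrate, a hypothetical expectation file, with made-up test and subtest names, looks roughly like this; check the README [2] for the exact format:

```ini
[example-test.html]
  [The subtest that we currently fail]
    expected: FAIL
```

Only tests whose results differ from a plain pass need an entry.)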

== OK, so how do I run the tests? ==

Locally, using mach:

mach web-platform-tests

or, to run only a subset of tests:

mach web-platform-tests --include=dom/

To run multiple tests at once (at the expense of undefined ordering
and greater nondeterminism), use the --processes=N option.

The tests are also available on Try; the trychooser syntax is

-u web-platform-tests

Individual chunks can also be run, much like for mochitest.

It's also possible to just start the web server and load tests into
the browser, as long as you add the appropriates entries to your hosts
file. These are documented in the web-platform-tests README file
[3]. Once these are added running

python serve.py

in testing/web-platform/tests will start the server and allow the
tests to be loaded from http://web-platform.test:8000.
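
(The hosts entries in question map the test domains to loopback; this is a rough sketch, the authoritative list is in the web-platform-tests README [3]:

```
127.0.0.1   web-platform.test
127.0.0.1   www.web-platform.test
127.0.0.1   www1.web-platform.test
127.0.0.1   www2.web-platform.test
```

Some tests also use internationalized subdomains, so the real list is longer.)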

== What does it mean if the tests are green? ==

It means that there are no "unexpected" results. These expectations
are set based on the existing behaviour of the browser. Every time the
tests are updated the expectations will be updated to account for
changes in the tests. It does *not* mean that there are no tests that
fail. Indeed there may be tests that have even worse behaviour like
hanging or crashing; as long as the behaviour is stable, the test will
remain enabled (this can occasionally interact somewhat wonkily with
the tbpl UI). When looking at jobs, unexpected results always start
with TEST-UNEXPECTED-.

So far I haven't spent any time filing bugs about issues found by the
tests, but there is a very basic report showing those that didn't pass
at [4]. I am very happy to work with people with some insight into
what bugs have already been filed to get new issues into Bugzilla. I
will also look at making a continually updated HTML report. In the
longer term I am hopeful that this kind of reporting can become part
of the Treeherder UI so it's easy to see not just where we have
unexpected results but also where there are expected failures
indicating buggy code.

== What kinds of things are covered by these tests? ==

web-platform-tests is, in theory, open to any tests for web
technologies. In practice most of the tests cover technologies in the
WHATWG/W3C stable e.g. HTML, DOM, various WebApps specs, and so
on. The notable omission is CSS; for historical reasons the CSS tests
are still in their own repository. Convergence here is a goal for the
future.

== We already have mochitests; why are we adding a new testsuite? ==

Unlike mochitests, web-platform-tests are designed to work in any
browser. This means that they aren't just useful for avoiding
regressions in Gecko, but also for improving cross-browser interop;
when developing features we can run tests that other implementers have
written, and they can run tests we have written. This will allow us to
detect compatibility problems early in a feature's life-cycle, before
they have the chance to become a source of frustration for
authors. With poor browser compatibility being one of the main
complaints about developing for the web, improvements in this area are
critical for the ongoing success of the platform.

== So who else is running the web-platform-tests? ==

* Blink runs some of the tests in CI ([5] and various other locations
  scattered through their tree)
* The Servo project are running all the tests for spec areas they have
  implemented in CI [6]
* Microsoft have an Internet Explorer-compatible version of the test runner.

In addition we are using web-platform-tests as one component of the
FirefoxOS certification suite.

The harness [7] we are using for testing Gecko is browser-agnostic so
it's possible to experiment with running tests in other browsers. In
particular it supports Firefox OS, Servo and Chrome, and Microsoft
have patches to support IE. Adding support for ot

Re: web-platform-tests now running in automation

2014-09-05 Thread Kyle Huey
On Fri, Sep 5, 2014 at 8:55 AM, James Graham  wrote:
> The web-platform-tests testsuite has just landed on
> Mozilla-Central. It is an import of a testsuite collated by the W3C
> [1], which we intend to keep up-to-date with upstream. The tests are
> located in /testing/web-platform/tests/ and are now running in automation.
> [...]

Re: web-platform-tests now running in automation

2014-09-05 Thread Boris Zbarsky

On 9/5/14, 11:55 AM, James Graham wrote:

The web-platform-tests testsuite has just landed on
Mozilla-Central.


This is fantastic.  Thank you!

Does this obsolete our existing "imptests" tests, or is this a set of 
tests disjoint from those?


-Boris


Re: web-platform-tests now running in automation

2014-09-05 Thread James Graham
On 05/09/14 18:00, Boris Zbarsky wrote:
> On 9/5/14, 11:55 AM, James Graham wrote:
>> The web-platform-tests testsuite has just landed on
>> Mozilla-Central.
> 
> This is fantastic.  Thank you!
> 
> Does this obsolete our existing "imptests" tests, or is this a set of
> tests disjoint from those?

I think Ms2ger has a better answer here, but I believe it obsoletes most
of them, except a few that never got submitted to web-platform-tests
(the editing tests are in that class, because the spec effort sort of died).

I've filed bug 1063632 to remove the imptests once we have better
platform coverage from web-platform-tests.



Re: Restricting gUM to authenticated origins only

2014-09-05 Thread Chris Peterson

On 9/5/14 4:39 AM, Henri Sivonen wrote:

>* Geolocation

In principle, I think geolocation should be restricted to
authenticated origins. Unfortunately, it might be too late
compatibility-wise to do that at this point. Also, since the
geolocation responses are easily proxied over postMessage, I think the
potential for a win is less than with gUM, whose response is a special
kind of object that doesn't travel (I hope!) over postMessage.


Google Maps and Yahoo Maps use HTTPS, but MapQuest and Bing Maps use 
HTTP. Before we could restrict geolocation to authenticated origins, we 
would need to convince Microsoft and MapQuest to use HTTPS (or whitelist 
their sites).



chris



Re: Restricting gUM to authenticated origins only

2014-09-05 Thread Ehsan Akhgari

On 2014-09-05, 4:37 PM, Chris Peterson wrote:

On 9/5/14 4:39 AM, Henri Sivonen wrote:

>* Geolocation

In principle, I think geolocation should be restricted to
authenticated origins. Unfortunately, it might be too late
compatibility-wise to do that at this point. Also, since the
geolocation responses are easily proxied over postMessage, I think the
potential for a win is less than with gUM, whose response is a special
kind of object that doesn't travel (I hope!) over postMessage.


Google Maps and Yahoo Maps use HTTPS, but MapQuest and Bing Maps use
HTTP. Before we could restrict geolocation to authenticated origins, we
would need to convince Microsoft and MapQuest to use HTTPS (or whitelist
their sites).


Those are not the only websites using the API.  There are many more.  I 
think we have probably lost our chance to make any changes here.




Re: Restricting gUM to authenticated origins only

2014-09-05 Thread Chris Peterson


On 9/5/14 2:38 PM, Ehsan Akhgari wrote:

Google Maps and Yahoo Maps use HTTPS, but MapQuest and Bing Maps use
HTTP. Before we could restrict geolocation to authenticated origins, we
would need to convince Microsoft and MapQuest to use HTTPS (or whitelist
their sites).


Those are not the only websites using the API.  There are many more.  
I think we have probably lost our chance to make any changes here. 


Yes. Sorry, I didn't make myself clear. I meant that, if major map 
websites like Bing and MapQuest are using geolocation without HTTPS, 
then the longtail of HTTP sites is probably too long to ever fix. :\



Re: Restricting gUM to authenticated origins only

2014-09-05 Thread Ehsan Akhgari

On 2014-09-05, 5:46 PM, Chris Peterson wrote:


On 9/5/14 2:38 PM, Ehsan Akhgari wrote:

Google Maps and Yahoo Maps use HTTPS, but MapQuest and Bing Maps use
HTTP. Before we could restrict geolocation to authenticated origins, we
would need to convince Microsoft and MapQuest to use HTTPS (or whitelist
their sites).


Those are not the only websites using the API.  There are many more. I
think we have probably lost our chance to make any changes here.


Yes. Sorry, I didn't make myself clear. I meant that, if major map
websites like Bing and MapQuest are using geolocation without HTTPS,
then the longtail of HTTP sites is probably too long to ever fix. :\


Yep, unfortunately, agreed!


Re: web-platform-tests now running in automation

2014-09-05 Thread Boris Zbarsky

On 9/5/14, 11:55 AM, James Graham wrote:

Instructions for performing the updates are in the README file
[2]. There is tooling available to help in the update process.


Is there a way to document the spec or test suite bugs in the 
expectations file?  e.g. if I want to add an "expected: FAIL" and link 
to https://github.com/w3c/web-platform-tests/issues/1223 as an 
explanation for why exactly we're failing it.


-Boris


Re: Restricting gUM to authenticated origins only

2014-09-05 Thread Martin Thomson
One idea that has been floated 
(https://bugzilla.mozilla.org/show_bug.cgi?id=1002676) is to restrict 
persistent permissions to secure origins.  The reasoning there being that a 
persistent grant can be trivially intercepted if you work in the clear.  That's 
a real security concern, and one that applies to gUM.

We might like to consider extending that to geolocation too.  But it's not 
clear that the security benefits outweigh the inconvenience.  The real 
solution is for those sites to get their act together, but that's a tall order.

I agree with Henri and others who have said that we shouldn't be following 
Google's example regarding restricting feature access to secure origins though.

- Original Message -
From: "Ehsan Akhgari" 
To: "Chris Peterson" , dev-platform@lists.mozilla.org
Sent: Friday, September 5, 2014 2:53:21 PM
Subject: Re: Restricting gUM to authenticated origins only

On 2014-09-05, 5:46 PM, Chris Peterson wrote:
>
> On 9/5/14 2:38 PM, Ehsan Akhgari wrote:
>>> Google Maps and Yahoo Maps use HTTPS, but MapQuest and Bing Maps use
>>> HTTP. Before we could restrict geolocation to authenticated origins, we
>>> would need to convince Microsoft and MapQuest to use HTTPS (or whitelist
>>> their sites).
>>
>> Those are not the only websites using the API.  There are many more. I
>> think we have probably lost our chance to make any changes here.
>
> Yes. Sorry, I didn't make myself clear. I meant that, if major map
> websites like Bing and MapQuest are using geolocation without HTTPS,
> then the longtail of HTTP sites is probably too long to ever fix. :\

Yep, unfortunately, agreed!