On Thursday, June 18, 2015 at 9:32:30 PM UTC+2, Chris Peterson wrote:
> It sounds like there are three use cases for WEBGL_debug_renderer_info:
> 
> 1. Correlating GPU info with bug reports (e.g. YouTube).
> 2. Web content workarounds for GPU bugs.
> 3. Fingerprinting for user tracking.

There are more use cases. To list just a few:

* Performance (preset) targeting, where the GPU string is used to guess which 
performance tier preset to apply, rather than starting at the lowest tier and 
using FPS ranging (which has its own problems due to FPS capping) and/or user 
input to arrive at a satisfactory setting.
* General GPU statistics (much like user-agent and capability statistics), 
which would be quite eye-opening to many WebGL developers.
* Site-specific GPU statistics (because general statistics are rarely 
representative of a given site's visitor base).
* GPU statistics can inform web developers which devices are representative of 
their visitor base, so they know what machines to test their code on (rather 
than just the latest and greatest GPU that happens to find itself in so many 
WebGL developers' machines).
* QA verification of deployments, to identify trouble spots, gauge their 
impact and prioritize engineering time to deal with the issue (an actual 
scenario I'm in right now).
* Comparative performance targeting: by analyzing the GPU population you see, 
and comparing it to the performance you get for your use case on your test 
farm, you can assign a factor to each GPU (using other GPU benchmark numbers 
such as those found on gfxbench.com, as well as GPU information such as 
core count and flops). This factor can quantitatively tell you how fast your 
application will run for how many of your visitors, so you can adjust your 
development accordingly and avoid building features that would make the 
experience far less enjoyable for the majority of your visitors.
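
The performance-tier targeting above can be sketched as a simple mapping from 
the unmasked renderer string to a preset. This is a hypothetical illustration: 
the tier table and patterns below are made-up assumptions, not real benchmark 
data — in practice the table would be driven by the site-specific statistics 
and test-farm factors described in the list.

```javascript
// Hypothetical sketch: guess a performance preset from the GPU string.
// The patterns and tiers are illustrative assumptions only; a real table
// would come from benchmark data gathered for your own visitor base.
function gpuTier(rendererString) {
  const s = rendererString.toLowerCase();
  // Ordered from strongest to weakest signal.
  if (/rtx|radeon rx|geforce gtx (9|10)/.test(s)) return "high";
  if (/geforce|radeon|iris/.test(s)) return "medium";
  if (/intel hd|mali|adreno|powervr|swiftshader/.test(s)) return "low";
  return "medium"; // unknown GPUs fall back to a safe middle preset
}

// In a browser the string would come from WEBGL_debug_renderer_info:
//   const gl = canvas.getContext("webgl");
//   const ext = gl.getExtension("WEBGL_debug_renderer_info");
//   const renderer = gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
//   applyPreset(gpuTier(renderer));
```

Starting from this guess, FPS ranging or user input can still adjust the 
preset; the string only avoids starting every visitor at the lowest tier.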

> To get #1, some of #2, and none of #3, can we just whitelist 
> WEBGL_debug_renderer_info for youtube.com or a couple of A-tier sites?
I (and many others on the open web) am very much not in favor of invite-only 
features that grant a select few sites an advantage denied to the general 
public.
 
A couple of other points I'd like to make about GPU strings:

Fingerprinting is hard to quantify objectively (because of the variety of 
privacy-protecting features these days, such as etag/cookie wiping). However, 
many of the suspected bits that a GPU string would expose are already 
contained in the parameters that WebGL 1 exposes, and WebGL 2 comes with 
twice as many parameters to query. I don't feel a discussion of the dangers 
of fingerprinting is terribly conducive without any quantification. There's 
also a certain mootness to discussing the implications of a few more bits of 
fingerprint when users can already be supercookied/fingerprinted perfectly 
without them.
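
To make the point concrete, here is a minimal sketch of how the parameters 
WebGL 1 already exposes could be folded into a fingerprint without the 
renderer string at all. The parameter list in the comment is a small 
illustrative subset, and the FNV-1a hash is just one simple way to combine 
the bits — both are my assumptions for illustration.

```javascript
// FNV-1a: a simple 32-bit string hash, used here only to fold many
// parameter values into one fingerprint-like token.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

// Combine already-exposed WebGL 1 values into a single token.
// In a browser the values would be read with standard queries, e.g.:
//   const values = [
//     gl.getParameter(gl.MAX_TEXTURE_SIZE),
//     gl.getParameter(gl.MAX_VERTEX_ATTRIBS),
//     gl.getParameter(gl.MAX_FRAGMENT_UNIFORM_VECTORS),
//     gl.getSupportedExtensions().join(","),
//   ];
function fingerprintFromParams(values) {
  return fnv1a(values.join("|"));
}
```

Any two machines that differ in one of these limits already produce different 
tokens, with no WEBGL_debug_renderer_info involved.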

Internet Explorer and Chrome enabled this feature and the world hasn't ended. 
By and large we're not seeing breakage because somebody used the feature 
inappropriately. WebGL developers want their content to run on as many 
visitors' machines as possible, and we're very conscious of avoiding 
decisions and code that would limit that.

In closing, the GPU string helps in a variety of ways, and over-simplifying 
it to two "use cases" does that a disservice. It is also generally useful to 
anybody who deploys WebGL content and has the wherewithal/budget to make use 
of that information, and it shouldn't be a browser's place to decide who may 
or may not.
_______________________________________________
dev-platform mailing list
dev-platform@lists.mozilla.org
https://lists.mozilla.org/listinfo/dev-platform