Jeremy Doig wrote:
Measuring the rate at which the playback buffer is filling/emptying gives a
fair indication of network goodput, but there does not appear to be a way to
measure just how well the client is playing the video itself. If I have a
wimpy machine behind a fat network connection, you may flood me with HD that
I just can't play very well. The cpu or video card may just not be able to
render the video well. Exposing a metric (eg: Dropped Frame count, rendered
frame rate) would allow sites to dynamically adjust the video which is being
sent to a client [eg: switch the url to a differently encoded file] and
thereby optimize the playback experience.
Anyone else think this would be good to have ?
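
To make the idea concrete, here is a rough sketch (in TypeScript) of how a page might act on such a metric if it existed. The PlaybackStats shape, the rendition URLs, and the maybeStepDown helper are all placeholders of my own; the proposal above doesn't name a concrete API.

// Hypothetical shape of the proposed metric; nothing in the proposal names a
// concrete API, so these fields are illustrative only.
interface PlaybackStats {
  droppedFrames: number;     // frames the decoder/renderer had to skip
  renderedFrameRate: number; // frames actually presented per second
}

// Renditions of the same content, highest quality first (placeholder URLs).
const renditions = ["movie_1080p.mp4", "movie_720p.mp4", "movie_480p.mp4"];
let current = 0;

// Called periodically with whatever stats the video element ends up exposing.
// If frames are being dropped, the bottleneck is local (CPU/GPU) rather than
// the network, so switch the element to a cheaper encoding of the same content.
function maybeStepDown(video: HTMLVideoElement, stats: PlaybackStats): void {
  if (stats.droppedFrames === 0 || current === renditions.length - 1) {
    return; // playing fine, or already at the lowest quality
  }
  const resumeAt = video.currentTime;
  current += 1;
  video.src = renditions[current];
  // Seek back to roughly where playback left off once the new file is ready.
  video.addEventListener(
    "loadedmetadata",
    () => {
      video.currentTime = resumeAt;
      void video.play();
    },
    { once: true }
  );
}

The point of the sketch is just that a dropped-frame count would let the page distinguish "the network can't keep up" from "the machine can't keep up" and step down the encoding accordingly.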

It seems like, in the short term at least, the "worse is better" solution to this problem is for content providers to offer links to resources at different quality levels, and to let users choose the most appropriate resource based on their internet connection and their computer, rather than having the computer try to work it out for them. Assuming that the majority of users visit a relatively small number of sites with the resources to provide multiple-quality versions of their videos, and do so from a small number of computing devices with roughly unchanging network conditions (which I imagine describes most non-technical users), they will quickly learn which version of the media works best for them on each site. The burden this simple approach places on end users therefore does not seem very high.

Given this, I would prefer automatic quality negotiation be deferred to HTML6.
