On Mon, 09 Feb 2009 22:13:09 +0100, Jeremy Doig <jerem...@google.com> wrote:

Measuring the rate at which the playback buffer is filling/emptying gives a
fair indication of network goodput, but there does not appear to be a way to
measure how well the client is playing the video itself. If I have a wimpy
machine behind a fat network connection, you may flood me with HD that I
just can't play very well: the CPU or video card may simply not be able to
render the video smoothly. Exposing a metric (e.g. dropped frame count,
rendered frame rate) would allow sites to dynamically adjust the video being
sent to a client [e.g. switch the URL to a differently encoded file] and
thereby optimize the playback experience.
Anyone else think this would be good to have? A sketch of what a site might do with such a metric follows.
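For concreteness, here is a minimal TypeScript sketch of the adaptation loop this would enable. Nothing in it is in the spec: the droppedFrameCount property, the threshold, and the file names are all invented for illustration.

    // Hypothetical: droppedFrameCount is NOT in the spec; it stands in
    // for whatever playback-quality metric the UA might expose.
    interface VideoWithMetrics extends HTMLVideoElement {
      droppedFrameCount: number; // frames decoded but never rendered
    }

    const video = document.querySelector('video') as VideoWithMetrics;
    let lastDropped = 0;
    let switched = false;

    // Periodically check how many frames were dropped since the last
    // check; fall back to a lighter encoding if the client can't keep up.
    setInterval(() => {
      const dropped = video.droppedFrameCount - lastDropped;
      lastDropped = video.droppedFrameCount;

      if (!switched && dropped > 30) { // arbitrary threshold, for illustration
        switched = true;
        const resumeAt = video.currentTime;
        video.src = 'movie-low.ogv';   // differently encoded file (hypothetical name)
        video.addEventListener('loadedmetadata', () => {
          video.currentTime = resumeAt; // resume near where playback left off
          video.play();
        }, { once: true });
      }
    }, 5000);

The point is only that a counter like this is enough for the site to make the switch itself; the UA does not have to decide anything.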

While I think this kind of thing might be useful, I would be careful about requiring detailed metrics like dropped frames or effective frame rate to be exposed via the DOM, as getting this information reliably across different devices, platforms and media frameworks would be quite difficult. How about an event which the user agent can optionally fire to indicate that it cannot play at the requested rate due to processing/memory constraints (rather than network)? This would, I think, provide the same functionality but put less burden on implementors.
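To illustrate, assuming a hypothetical non-fatal event (call it 'playbackoverload'; the name and its semantics are invented here, not in any spec), site code could react without polling any metrics:

    // Hypothetical: 'playbackoverload' is an invented event name standing in
    // for a UA-fired, non-fatal "can't keep up for non-network reasons" signal.
    const video = document.querySelector('video')!;

    video.addEventListener('playbackoverload', () => {
      // The UA says decoding/rendering can't sustain the current stream;
      // switch to a less demanding encoding instead of probing metrics.
      const resumeAt = video.currentTime;
      video.src = 'movie-low.ogv'; // hypothetical lighter encoding
      video.addEventListener('loadedmetadata', () => {
        video.currentTime = resumeAt;
        video.play();
      }, { once: true });
    });

An event also leaves the UA free to decide when playback is actually degraded, rather than committing every implementation to the same counters.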

There is already a placeholder for non-fatal errors in the spec; perhaps this could be included with that in some fashion?

--
Philip Jägenstedt
Opera Software
