Measuring the rate at which the playback buffer is filling/emptying gives a
fair indication of network goodput, but there does not appear to be a way to
measure how well the client is actually playing the video itself. If I have a
wimpy machine behind a fat network connection, you may flood me with HD that
I just can't play very well; the CPU or video card may simply not be able to
render the video smoothly. Exposing a metric (e.g. dropped frame count,
rendered frame rate) would allow sites to dynamically adjust the video being
sent to a client [e.g. switch the URL to a differently encoded file] and
thereby optimize the playback experience.
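
To make the idea concrete, here is a rough sketch (TypeScript) of how a page
might use such a metric if it were exposed. The droppedFrameCount attribute,
the polling interval, the drop threshold, and the fallback URL are all
hypothetical, just for illustration:

  // Hypothetical: assumes the video element exposed a cumulative
  // droppedFrameCount attribute, as proposed above.
  interface VideoWithStats extends HTMLVideoElement {
    droppedFrameCount: number;
  }

  const video = document.querySelector("video") as VideoWithStats;
  let lastDropped = 0;

  // Poll every few seconds; if the client is dropping too many frames,
  // fall back to a lower-bitrate encode of the same content.
  setInterval(() => {
    const droppedSinceLastCheck = video.droppedFrameCount - lastDropped;
    lastDropped = video.droppedFrameCount;

    if (droppedSinceLastCheck > 30) {       // threshold is arbitrary
      const resumeAt = video.currentTime;
      video.src = "movie-low-bitrate.mp4";  // hypothetical fallback encode
      video.currentTime = resumeAt;
      video.play();
    }
  }, 5000);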
Anyone else think this would be good to have?

Thanks,
Jeremy
