On Thu, Mar 8, 2012 at 3:48 AM, Jason Duell <[email protected]> wrote:
> On 03/07/2012 04:10 PM, Robert O'Callahan wrote:
>
> On Wed, Mar 7, 2012 at 1:55 PM, Ashwin Rao <[email protected]>
> wrote:
>>
>> > Our media cache already calls suspend/resume "as needed" to throttle
>> > downloading when the cache fills up. It is a blunt hammer :-).
>> >
>>
>> The media cache size is 500MB. If your video is smaller than that, you
>> won't see any throttling.
>>
>>
>> Try setting media.cache_size to, say, 50MB and preload a 200MB video;
>> you should see downloading pause after approximately 50MB has been
>> loaded. Then start playing, and eventually you should see downloading
>> resume; the download will pause and resume to keep a window of data
>> ahead of the play point.
>>
>> (I wouldn't call this "rate throttling", since it's not explicitly based
>> on rate.)
>
>
> Necko internally would be doing rate throttling using the same
> suspend/resume mechanism--that's all TCP gives us. So it's not so much a
> less-blunt hammer as one that we can swing more quickly (i.e. the socket
> transport thread can keep track of the bandwidth coming in and throttle
> it without the overhead/noise of thread events being sent back and forth
> to the main thread from the video cache). I'm not sure how much
> difference that makes in practice, though--there's a good chance it'd be
> smoother. AFAICT this avoidance of event latency, plus a better sense of
> "bottleneck bandwidth", are the only two advantages of doing this in
> necko. That might be reason enough, or it might make sense to do a first
> version of this using suspend/resume from the media cache with a <500MB
> buffer.
>

I agree that the buffer size should be less than 500 MB. A buffer size
of 10 MB would be large enough for most videos: 10 MB = 80 Mbits = 80
seconds of playback data for a video encoded at 1 Mbps. For HD videos
with an encoding rate of 5 Mbps it would account for 16 seconds of
playback time. We could try a double-buffering scheme where the
download resumes (or begins) when the buffered amount falls below 10 MB
and pauses when it reaches 20 MB. I need to revisit the typical
encoding rates used by YouTube, Netflix, Vimeo, and Dailymotion; I do
not know the encoding rates used by other video streaming services.
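
To make the idea concrete, here is a minimal sketch of that
double-buffering logic, assuming a hypothetical Throttler class driven
by the buffered-but-unplayed byte count (the 10 MB / 20 MB water marks
are the ones suggested above, and the Suspend/Resume comments stand in
for pausing the underlying channel):

#include <cstdint>
#include <cstdio>

class Throttler {
  static const int64_t kLowWaterMark = 10LL * 1024 * 1024;   // resume below 10 MB
  static const int64_t kHighWaterMark = 20LL * 1024 * 1024;  // suspend at 20 MB
  bool mSuspended;

public:
  Throttler() : mSuspended(false) {}

  // Called whenever data arrives or playback consumes data;
  // |buffered| is the number of bytes downloaded but not yet played.
  void Update(int64_t buffered) {
    if (!mSuspended && buffered >= kHighWaterMark) {
      mSuspended = true;  // a real version would call channel->Suspend() here
      printf("suspend at %lld buffered bytes\n", (long long)buffered);
    } else if (mSuspended && buffered < kLowWaterMark) {
      mSuspended = false;  // a real version would call channel->Resume() here
      printf("resume at %lld buffered bytes\n", (long long)buffered);
    }
  }
};

int main() {
  Throttler t;
  t.Update(21LL * 1024 * 1024);  // buffer fills past 20 MB -> suspend
  t.Update(9LL * 1024 * 1024);   // playback drains it below 10 MB -> resume
  return 0;
}

The gap between the two water marks gives hysteresis, so the download
is not toggled on every small change in the buffer level.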

>> Patrick wrote:
>>
>> The problem isn't the buffered video but the IP-level buffering that
>> happens on big TCP downloads... that's why I filed it against
>> networking first
>
> I don't follow--the OS buffers for a TCP socket are much smaller than
> the media cache's buffer. Or are you talking about clogging router
> buffers? I still don't see how necko doing the suspend/resume vs the
> media cache makes a difference here.
>

The problem is that TCP tries to saturate the buffers at the routers
until a packet drop is encountered. On a packet drop it reduces its
sending rate, and the rate is then slowly increased (in the case of
Reno) until the next packet drop. Home gateways tend to have large
buffers. One reason for large buffers is to absorb bursts of packets;
however, the side effect of these large buffers is that the
steady-state queuing delay at the home gateway tends to exceed the
propagation delay. This can reduce the performance and responsiveness
of TCP flows.
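
As a back-of-the-envelope illustration (the 256 KB buffer and 1 Mbps
link below are assumed example numbers, not measurements): once TCP has
filled the gateway buffer, every packet waits buffer_size / link_rate
seconds in the queue.

#include <cstdio>

int main() {
  double bufferBytes = 256.0 * 1024.0;  // assumed gateway buffer: 256 KB
  double linkBytesPerSec = 1e6 / 8.0;   // assumed 1 Mbps link, in bytes/sec
  double queuingDelay = bufferBytes / linkBytesPerSec;
  printf("steady-state queuing delay: %.2f s\n", queuingDelay);  // ~2.10 s
  return 0;
}

A two-second standing queue dwarfs typical propagation delays of tens
of milliseconds, which is why other traffic sharing the gateway becomes
unresponsive.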

TCP flows transferring streaming video content do not need to send at
the end-to-end available bandwidth -- they can send at a rate that is
close to the video encoding rate. This can ensure that the TCP flows
transferring video content have a smaller footprint on the router
buffers.
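
Here is a minimal sketch of how such pacing could sit on top of
suspend/resume. The RatePacer class is hypothetical, and the 1.25x
factor is borrowed from the YouTube measurements cited below:

#include <cstdint>
#include <cstdio>

class RatePacer {
  double mTargetBps;  // target download rate in bytes per second
  int64_t mBytes;     // bytes received since the window started
  double mStart;      // window start time, in seconds

public:
  RatePacer(double encodingBps, double now)
    : mTargetBps(1.25 * encodingBps), mBytes(0), mStart(now) {}

  // Called from the data-delivery path; returns true when the measured
  // rate exceeds the target, i.e. when the channel should be suspended.
  bool ShouldSuspend(int64_t newBytes, double now) {
    mBytes += newBytes;
    double elapsed = now - mStart;
    if (elapsed <= 0.0)
      return false;
    return (mBytes / elapsed) > mTargetBps;
  }
};

int main() {
  RatePacer pacer(1e6 / 8.0, 0.0);  // 1 Mbps encoding rate, in bytes/sec
  // 100 KB delivered in the first 0.5 s is ~1.6 Mbps, above the
  // 1.25 Mbps target, so the pacer asks for a suspend.
  printf("%s\n", pacer.ShouldSuspend(100 * 1024, 0.5) ? "suspend" : "keep going");
  return 0;
}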

Another important advantage of downloading at a reduced rate is that
the amount of unused bytes -- bytes downloaded by Firefox but never
consumed by the player because the user interrupted playback -- is
kept to a minimum. This can keep Firefox's resource footprint (media
cache, memory, and disk) small. I consider unused bytes to be wasted
bytes, which in turn reflect a waste of the network resources that
were used to transfer them.
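
To put an assumed number on it: if a user abandons a 200 MB video 30
seconds in (at a 1 Mbps encoding rate), an unthrottled download may
already have fetched most of the file, while a 20 MB window bounds the
waste.

#include <cstdio>

int main() {
  double fileMB = 200.0;
  double playedMB = 30.0 * (1e6 / 8.0) / (1024.0 * 1024.0);  // ~3.6 MB played
  double wastedUnthrottled = fileMB - playedMB;              // up to ~196 MB
  double wastedThrottled = 20.0;                             // at most the window
  printf("wasted: unthrottled up to ~%.0f MB, throttled <= %.0f MB\n",
         wastedUnthrottled, wastedThrottled);
  return 0;
}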

Regards,
Ashwin



> Jason
>
>
>
>>
>> Work by Don Towsley suggests that a rate of up to 2 times the
>> median encoding rate is sufficient for smooth playback [
>> http://dl.acm.org/citation.cfm?id=1027735 ]. Regarding buffering, I did
>> some measurements on YouTube where I observed that YouTube begins a
>> streaming session (for Flash videos) by buffering 40 seconds of
>> playback data before limiting the download rate to 1.25 times the
>> video encoding rate. The details of the results are available at [
>> http://hal.inria.fr/inria-00638063/en/ ].
>
>
> Either of those could be implemented in Gecko pretty easily, I guess,
> although as I said before, I'm not sure pausing and resuming the Necko
> download is adequate to hit a smooth target rate.
>
>
>
_______________________________________________
dev-tech-network mailing list
[email protected]
https://lists.mozilla.org/listinfo/dev-tech-network
