Hi Sam,
 
First, thanks! It is always good to see more dedicated tools to assess 
latency under load, especially tools that are easy to use and do not require 
the user to maintain her/his own dedicated endpoints.
More below in-line, prefixed [SM].
 
 

Sent: Wednesday, 04 November 2020 at 22:30
Von: "Sam Westwood" <s...@repeaterstore.com>
An: bloat@lists.bufferbloat.net
Betreff: [Bloat] We built a new bufferbloat test and keen for feedback

Hi everyone, 
My name is Sam and I'm the co-founder and COO of Waveform.com. At Waveform we 
provide equipment to help improve cell phone service, and being in the industry 
we've always been interested in all aspects of network connectivity. 
Bufferbloat for us has always been interesting, and while there are a few tests 
out there we never found one that was fantastic. So we thought we'd try and 
build one!
My colleague Arshan has built the test, which we based upon the Cloudflare 
Speedtest template that was discussed earlier in the summer in a previous 
thread.
 
We measure bufferbloat under two conditions: when downlink is saturated and 
when uplink is saturated.
 
        [SM] This is a decent starting point. In addition it might be helpful 
to at least optionally include a test with bidirectional saturating load; in 
the past such tests typically were quite successful in detecting bufferbloat 
sources that were less obvious in the uni-directional load tests. I am not 
sure, however, how well that can work with a browser-based test.
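
         Purely mechanically, launching both directions at once is scriptable 
from a browser; whether that actually saturates both directions is a separate 
question. A rough sketch in TypeScript, with placeholder URLs, sizes and the 
"?bytes=" parameter all being my assumptions, not the actual 
Waveform/Cloudflare endpoints:

// Rough sketch of a bidirectional load phase: start a large download and a
// large upload at the same time and keep both running while latency is
// sampled elsewhere. URLs, sizes and the "?bytes=" parameter are placeholders.
async function runBidirectionalLoad(downUrl: string, upUrl: string, seconds: number): Promise<void> {
  const controller = new AbortController();
  const stop = setTimeout(() => controller.abort(), seconds * 1000);

  // A real test would probably loop several parallel streams until the timer
  // fires; a single large transfer per direction keeps the sketch short.
  const download = fetch(`${downUrl}?bytes=100000000`, {
    cache: "no-store",
    signal: controller.signal,
  }).then(r => r.arrayBuffer()).catch(() => undefined); // abort is expected

  const upload = fetch(upUrl, {
    method: "POST",
    body: new Uint8Array(50 * 1024 * 1024), // ~50 MB of zeros
    signal: controller.signal,
  }).catch(() => undefined); // abort is expected

  await Promise.all([download, upload]);
  clearTimeout(stop);
}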
 
 
The test involves three stages: Unloaded, Downlink Saturated, and Uplink 
Saturated. In the first stage we simply measure latency to a file hosted on a 
CDN. This is usually around 5ms and could vary a bit based on the user's 
location. We use the webTiming API to find the time-to-first-byte, and consider 
that as the latency. 
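
        [SM] For concreteness, a minimal sketch of how such a 
time-to-first-byte measurement might look using the browser's Resource Timing 
API; the helper name and probe URL handling are my assumptions, not the actual 
implementation, and cross-origin probes need a Timing-Allow-Origin header for 
the timing fields to be populated:

// Minimal sketch: read time-to-first-byte for a small CDN-hosted probe file
// via the Resource Timing API. The probe URL is a placeholder.
async function measureTtfb(probeUrl: string): Promise<number> {
  // Cache-bust so every request really goes out on the wire.
  const url = `${probeUrl}?cacheBust=${Date.now()}`;
  await fetch(url, { cache: "no-store" });

  const entries = performance.getEntriesByName(url) as PerformanceResourceTiming[];
  const entry = entries[entries.length - 1];
  // responseStart - requestStart ~ time-to-first-byte for this request; note
  // this includes server processing time, so it is not the same number an
  // ICMP echo against the host would report.
  return entry.responseStart - entry.requestStart;
}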

        [SM] Mmmh, I like that this is a relevant latency measure; it might 
make sense, though, to make sure users realize that this is not the equivalent 
of running an ICMP echo request against the same endpoint.


In the second stage we run a download, while simultaneously measuring latency. 
In the third stage we do the same but for upload. Both download and upload 
usually take around 5 seconds. 
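
        [SM] To make the suggestions below a bit more concrete: latency 
samples taken during a load phase could be tagged with a timestamp and a phase 
label, which is also all that is needed for the time-resolved plots mentioned 
further down. A sketch, reusing the hypothetical measureTtfb() helper from the 
earlier sketch:

// Sketch: sample latency at a fixed interval while a load phase is running,
// tagging each sample with its phase and a timestamp so the results can be
// plotted over time.
interface LatencySample {
  phase: "idle" | "download" | "upload" | "post-idle";
  atMs: number;   // ms since the start of the whole test
  rttMs: number;  // measured time-to-first-byte
}

async function sampleLatency(
  phase: LatencySample["phase"],
  probeUrl: string,
  seconds: number,
  testStart: number,
): Promise<LatencySample[]> {
  const samples: LatencySample[] = [];
  const end = performance.now() + seconds * 1000;
  while (performance.now() < end) {
    const rttMs = await measureTtfb(probeUrl); // hypothetical helper from the earlier sketch
    samples.push({ phase, atMs: performance.now() - testStart, rttMs });
    await new Promise(resolve => setTimeout(resolve, 250)); // ~4 probes per second
  }
  return samples;
}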

        [SM] On heavily bufferbloated links it often takes a considerable 
amount of time for the bottleneck buffers to drain after a uni-directional 
test, so it might make sense to separate the two directional tests with an 
additional phase of idle latency measurements. If that latency looks like the 
initial unloaded latency, all is well, but if latency slowly ramps down during 
that phase you have a smoking gun for bad bufferbloat.
         Also, there are link technologies and scheduler techniques that can 
prioritise relatively short flows (e.g. Comcast's PowerBoost). To avoid just 
measuring the properties of these short-duration special modes, it might make 
sense to optionally and considerably lengthen the test duration to, say, 30 
seconds (empirically, PowerBoost does not engage for a full 30-second period 
at full rate, but that might become an arms race). Also, to assess possible 
root causes for latency and rate issues, it is very helpful to show 
time-resolved plots of how rate and latency develop over the duration of all 
phases of the test. For example, using longer-running flent tests I could 
pinpoint the cyclic channel scanning of my laptop's wifi as a source of 
repeated bufferbloat with a period of ~10 seconds, by seeing evenly spaced 
latency spikes and rate dips every 10 seconds that went away when switching to 
wired ethernet...
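
         To make the idle-drain idea concrete, here is a small sketch of the 
kind of check one could run on the samples from such an extra idle phase; the 
function name and tolerance are arbitrary illustrations:

// Sketch: after a load phase ends, keep sampling during an idle gap and flag
// the case where latency only slowly drains back towards the unloaded
// baseline -- a hint that a bottleneck buffer is still emptying.
function buffersStillDraining(
  baselineMs: number,          // e.g. the median of the initial unloaded phase
  postIdleSamplesMs: number[], // latency samples from the idle gap, in order
  toleranceMs = 5,             // arbitrary illustration, not a recommendation
): boolean {
  if (postIdleSamplesMs.length < 4) return false;
  const mid = Math.floor(postIdleSamplesMs.length / 2);
  const mean = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;
  const earlier = mean(postIdleSamplesMs.slice(0, mid));
  const later = mean(postIdleSamplesMs.slice(mid));
  // "Still elevated, but falling" is the smoking-gun pattern for buffers
  // slowly emptying after the load stopped.
  return earlier > baselineMs + toleranceMs && later < earlier;
}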

We show the median, first quartile and the third quartile on distribution 
charts corresponding to each stage to provide a visual representation of the 
latency variations. For download and upload we have used Cloudflare's speedtest 
backend.
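
        [SM] For reference, the quartile summary is cheap to compute over the 
collected samples; a sketch using simple linear interpolation between ranks, 
which is not necessarily what your backend does:

// Sketch: median / first quartile / third quartile of the latency samples.
function quantile(sortedMs: number[], q: number): number {
  // Linear interpolation between the two nearest ranks.
  const pos = (sortedMs.length - 1) * q;
  const lower = Math.floor(pos);
  const upper = Math.ceil(pos);
  const weight = pos - lower;
  return sortedMs[lower] * (1 - weight) + sortedMs[upper] * weight;
}

function summarize(samplesMs: number[]): { q1: number; median: number; q3: number } {
  const sorted = [...samplesMs].sort((a, b) => a - b);
  return {
    q1: quantile(sorted, 0.25),
    median: quantile(sorted, 0.5),
    q3: quantile(sorted, 0.75),
  };
}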

        [SM] This is great; it would be nice, though, to also add a graphical 
representation, be it a histogram or a cumulative distribution plot of 
latencies (split out for idle, download, upload, and the idle period between 
down- and upload).
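
         A cumulative distribution curve per phase is similarly easy to derive 
from the raw samples; a sketch:

// Sketch: empirical CDF points for one phase's latency samples, suitable for
// plotting as a cumulative distribution curve (one curve per phase).
function empiricalCdf(samplesMs: number[]): Array<{ latencyMs: number; fraction: number }> {
  const sorted = [...samplesMs].sort((a, b) => a - b);
  return sorted.map((latencyMs, i) => ({
    latencyMs,
    fraction: (i + 1) / sorted.length,
  }));
}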


Best Regards
        Sebastian


 
You can find the test here: https://www.waveform.com/apps/dev-arshan
 
We built and tested it on Chrome, but it works on Firefox and mobile too. On 
mobile, results may be a little different, as the APIs aren't available, so 
instead we implemented a more manual method, which can be a little noisier.
 
This is a really early alpha, and so we are keen to get any and all feedback 
you have :-). Things that we would particularly like feedback on:
- How does the bufferbloat measure compare to other tests you may have run on 
  the same connection (e.g. dslreports, fast.com)?
- How do the throughput results (download/upload/latency) look compared to 
  other tools?
- Any feedback on the user interface of the test itself? We know that before 
  releasing more widely we will put more effort into explaining bufferbloat 
  than we have so far.
- Anything else you would like to give feedback on?
We have added a feature to share results via a URL, so please feel free to 
share these if you have specific feedback.
Thanks!
Sam and Arshan
 
*************************
Sam Westwood
Co-Founder & COO | RSRF & Waveform
E   s...@waveform.com
D   (949) 207-3175
T   1-800-761-3041 Ext. 100
W   www.rsrf.com & www.waveform.com

_______________________________________________
Bloat mailing list
Bloat@lists.bufferbloat.net
https://lists.bufferbloat.net/listinfo/bloat
