Hi Cansu,

Thanks for the quick response. 

I was looking for a client-side metric that is representative of media-server 
performance, one that measures responsiveness or latency rather than 
throughput. The reason I ask is that there is some network latency between 
the server and the client, and the server-side metrics (I am assuming) don't 
capture it. 

You did not mention using the response times in the summary.xml that we get 
at the end of a run. Is there any issue with those metrics? For instance, I 
always find the 99th-percentile response time to be the same constant number 
for every run. Further, I consistently find the benchmark/driver "passed" 
value to be "false" in my results (especially for response time). I did make 
sure that the ramp-down time is twice as long as the longest video requested 
(I am requesting only the shortest video of the whole mix).
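
For reference, this is roughly how I am inspecting summary.xml at the moment 
(a quick sketch; the schema is not documented, so I am guessing at element 
names and just dumping anything that looks like a response-time or 
percentile statistic):

    import xml.etree.ElementTree as ET

    # Walk the whole summary.xml and print any element whose tag looks
    # like a response-time or percentile field. The name patterns below
    # are guesses -- adjust once the real schema is known.
    tree = ET.parse("summary.xml")
    for elem in tree.getroot().iter():
        tag = elem.tag.lower()
        if "resp" in tag or "percentile" in tag:
            print(elem.tag, elem.attrib, (elem.text or "").strip())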

Thanks!
Venkat

On Nov 4, 2013, at 5:54 PM, Cansu Kaynak <[email protected]> wrote:

> Dear Venkat,
> First of all, thanks for the feedback. We have some information about 
> performance and QoS at the end of the Media Streaming page. However, it looks 
> like it is not sufficient/clear. So, we’ll definitely enhance it.
> 
> Regarding the performance metrics (some of these are already explained on the 
> web page), we have a throughput metric and a QoS metric for Media Streaming, 
> and both are reported on the server side (if you run the server with the 
> command specified on the web page). You can compare server throughput across 
> runs by comparing the RTSP-Conns or kBits/Sec fields (they are proportional) 
> while the server is running at a steady state and the QoS requirements are 
> met (provided you keep the dataset and request (video) mix constant across 
> runs). To ensure QoS, we check that the AvgDelay reported by the server is 
> less than 0. We arrived at this threshold by gradually saturating the server 
> and verifying that an additional client could still stream a video without 
> any interruptions. This is not a perfect QoS metric, but it is an estimate 
> that avoids oversaturating the server.  
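> 
> As a rough illustration, a small script like the sketch below can watch the 
> server's periodic status output (piped in on stdin) and flag the windows in 
> which the throughput reading is valid. The column names are assumptions from 
> our setup and may differ in yours:
> 
>     import sys
> 
>     # Sketch: read the server's periodic status lines from stdin and
>     # report kBits/Sec only while the QoS condition (AvgDelay < 0)
>     # holds. Assumes a whitespace-separated header line that names the
>     # columns; "AvgDelay" and "kBits/Sec" are assumed column names.
>     header = None
>     for line in sys.stdin:
>         fields = line.split()
>         if "AvgDelay" in fields:          # header line
>             header = fields
>             continue
>         if header and len(fields) == len(header):
>             row = dict(zip(header, fields))
>             try:
>                 delay = float(row["AvgDelay"])
>             except ValueError:
>                 continue
>             if delay < 0:
>                 print("QoS OK, kBits/Sec =", row["kBits/Sec"])
>             else:
>                 print("QoS violated, AvgDelay =", row["AvgDelay"])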
> 
> Hope this helps.
> Please let us know if you have more questions and/or suggestions.
> 
> --
> Cansu
> 
> On 04 Nov 2013, at 21:10, Venkatanathan Varadarajan <[email protected]> 
> wrote:
> 
>> Hi all,
>> 
>> I was wondering if anyone could explain the results of the media-streaming 
>> benchmark. The documentation on the website does not cover them.
>> 
>> 1. The response times are in seconds. Is this the total time taken to 
>> stream a complete video, or the time between the client sending a request 
>> and receiving a response from the server?
>> 2. Similarly, what does "ops/sec" mean? What counts as an operation here? 
>> Streaming a video?
>> 
>> I think the characteristic client-side performance metrics for a 
>> media-streaming server are the average latency of each frame (audio or 
>> video), or the frame transmission rate, together with the percentage of 
>> user-perceivable stream disruptions/violations, or something similar.
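>> 
>> To make this concrete, something like the following computed at the client 
>> is what I have in mind (a sketch; the per-frame timestamps would have to 
>> come from instrumenting the client, and the disruption threshold is a 
>> guess):
>> 
>>     # Sketch: given per-frame (expected_play_time, actual_arrival)
>>     # pairs collected at the client, compute the average frame latency
>>     # and the percentage of user-perceivable disruptions.
>>     THRESHOLD = 0.1  # seconds; what counts as "perceivable" is a guess
>> 
>>     def frame_metrics(frames):
>>         delays = [arrival - expected for expected, arrival in frames]
>>         avg_latency = sum(delays) / len(delays)
>>         disrupted = 100.0 * sum(d > THRESHOLD for d in delays) / len(delays)
>>         return avg_latency, disrupted
>> 
>>     print(frame_metrics([(0.00, 0.02), (0.04, 0.05), (0.08, 0.30)]))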
>> 
>> Is there an application-level metric in the result summary that is 
>> characteristic of the media-streaming benchmark? Or, what should be used as 
>> a metric for the media-streaming benchmark? Am I missing something?
>> 
>> One general comment about the CloudSuite 2.0 benchmark suite: the 
>> documentation fails to explain the results/metrics for some of the 
>> benchmarks. A benchmark is incomplete without a defined/standardized metric 
>> that can be used to compare different runs of the same benchmark. It would 
>> be great if each benchmark's section on the website also explained the 
>> metric that should be used for it.
>> 
>> Thanks,
>> Venkat
>> 
>> -- 
>> Venkatanathan Varadarajan (Venkat),
>> Graduate Student,
>> University of Wisconsin-Madison.
>> 
> 
