Hi All,

We have a fairly simple test that logs in to our application.

We've set up a Gaussian Random Timer and are monitoring the results in an
Aggregate Report.

My question is about the throughput: if we reduce the delays in the Timer, the
time taken to log in to the application increases (which makes sense, as there
is more load on the server).

Why is the throughput also increasing? If each request takes much longer
(roughly 10 times as long) when we increase the load, shouldn't the throughput
be lower?

Some numbers to illustrate:
Timer avg 30 sec / deviation 15 sec => average request  700 ms, max   2760 ms, throughput 3.1/sec
Timer avg 10 sec / deviation  3 sec => average request 9610 ms, max 114300 ms, throughput 5.0/sec

Can someone please explain how this is possible?
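In case it helps, this is the rough model I've been trying to check the numbers
against (just a sketch; the thread count of 100 below is a hypothetical value,
as I haven't listed our real one above): each thread loops, waiting for the
timer delay and then for the response, so I'd expect roughly
threads / (timer delay + response time) requests per second.

    // Rough estimate only - assumes a fixed number of threads looping over the
    // login sampler, each iteration costing (timer delay + response time).
    public class ThroughputEstimate {
        static double estimate(int threads, double delaySec, double responseSec) {
            return threads / (delaySec + responseSec);
        }

        public static void main(String[] args) {
            int threads = 100; // hypothetical - our actual thread count isn't shown above
            System.out.printf("30s timer: %.1f req/sec%n", estimate(threads, 30.0, 0.7));
            System.out.printf("10s timer: %.1f req/sec%n", estimate(threads, 10.0, 9.61));
        }
    }

If that model is right, the shorter think time would outweigh the slower
responses and the throughput would still go up, but I may well be missing
something.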


Kind regards
Jörg Godau

SCHÜTZE Consulting Informationssysteme GmbH
Argentinische Allee 22b
14163 Berlin
Tel.: 030/ 802 49 44
Fax: 030/ 8090 39 95
www.schuetze-berlin.de

Managing Director: Klaus-Dieter Schütze
Register court: Amtsgericht Charlottenburg
Registration number: HRB 73618
VAT identification number pursuant to § 27a Umsatzsteuergesetz: DE 813181239




