Hi all,
 
I am using the Gaussian Random Timer, but its behaviour makes no sense to me.
It says that everything is in milliseconds, but I find that hard to believe.
As far as I know, 1000 milliseconds is 1 second, right?
 
When I fill in 1000 milliseconds for the Deviation and 1000 milliseconds for the
Constant Delay, I would expect a given request to be delayed by at most 2 seconds
(I have put the Gaussian Random Timer as a child of every HTTP request).
But when I run the script, it takes a lot more than those 2 seconds before the
next request is executed.
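
To check my own understanding, here is a small sketch of how I assume the timer combines those two values (this is only my assumption, not the actual JMeter code): a normally distributed value, with the Deviation as the standard deviation, added on top of the constant delay.

    import java.util.Random;

    public class GaussianDelaySketch {
        public static void main(String[] args) {
            Random random = new Random();
            double deviation = 1000.0;     // my Deviation setting, in milliseconds
            double constantDelay = 1000.0; // my Constant Delay setting, in milliseconds

            // Print some sample delays the way I assume they are generated:
            // nextGaussian() has mean 0 and standard deviation 1, so each delay is
            // the constant delay plus a normal sample scaled by the deviation.
            for (int i = 0; i < 10; i++) {
                long delay = Math.round(random.nextGaussian() * deviation + constantDelay);
                System.out.println("sample delay: " + delay + " ms");
            }
        }
    }

If that assumption is correct, the Deviation would be a standard deviation rather than a hard upper limit, but I would like to confirm whether that is really how the timer works.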
Am I doing something wrong or is this a bug?
 
Regards,
 
Marcel

