The thing is, you seem to have the test set up to run as fast as it can, without
any kind of pacing. In that situation it is entirely possible that some machines
are simply faster than others, and that running locally behaves differently from
running in client/server mode.

But this doesn't actually matter. If you configure your test plan to
generate a defined, consistent load profile, then all you have to do is make
sure that each machine you use is capable of generating that profile. Sure,
some might be able to do more, but that is irrelevant; all that matters is
that each jmeter-server can deliver what is needed.
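To make that concrete: for me a "defined profile" is just a fixed
samples-per-minute figure that the plan itself enforces, typically with the
Constant Throughput Timer I describe below. A minimal sketch of the timer
settings, assuming the figure is fed in as a JMeter property (the property
name "throughput" and the 1200 are made-up examples, and the right "calculate
based on" mode depends on how your thread groups are laid out):

    Constant Throughput Timer
      Target throughput (samples per minute): ${__P(throughput,1200)}
      Calculate Throughput based on:          all active threads (shared)

Give the Thread Group more threads than the timer strictly needs and the
timer, not the hardware, becomes the thing that sets the pace, which is
exactly why a faster box no longer produces a different result.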

Here is what I do: for each engagement, I establish a target profile that I
need to prove is achievable. I then design a test using Constant Throughput
Timers that delivers this load. Usually I split the load over multiple
machines, so I might design the plan to generate 10% of the target and then
run it on 10 boxes (a sketch of this is below). That way I know the test I
run is realistic and useful, and I can worry about other things, like tuning
the system under test.
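As a sketch of that split (the host names and figures are invented, not from
any real engagement): if the target were 12,000 requests per minute, each of
the 10 boxes only has to prove it can sustain 1,200. Kicking it off from the
client looks something like:

    jmeter -n -t plan.jmx \
        -R box1,box2,box3,box4,box5,box6,box7,box8,box9,box10 \
        -Gthroughput=1200 -l results.jtl

-G pushes the property to every remote jmeter-server, so the same plan drives
every injector at the same per-box rate, and the only capacity check you need
is that one box can comfortably hold 1,200/min on its own.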



