The difference appears to be about 10 seconds between the clock on my
machine and the one on the slave server.  I added a Constant Timer and that
made no difference.

Do the two machines really have to be synced down to the exact second?

I would think we are measuring the delta between start and stop on the same
machine, so the clocks should not matter.
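
To illustrate what I mean, here is a minimal sketch.  It rests on my
assumption (I have not checked the JMeter source) that the elapsed time
is computed from two reads of System.currentTimeMillis(), which tracks
the system clock and can jump backwards if ntpd steps the clock while a
sample is in flight.  A monotonic clock would not have that problem:

    // Sketch: wall-clock elapsed time vs. a monotonic clock.
    public class ClockStepDemo {
        public static void main(String[] args) throws InterruptedException {
            long wallStart = System.currentTimeMillis(); // wall clock, adjustable
            long monoStart = System.nanoTime();          // monotonic, never steps

            Thread.sleep(100); // stand-in for the request in flight;
            // if ntpd stepped the clock back ~1 s right here, the
            // wall-clock delta below would come out negative (e.g. -897),
            // while the monotonic delta would still be ~100 ms

            long wallElapsed = System.currentTimeMillis() - wallStart;
            long monoElapsed = (System.nanoTime() - monoStart) / 1000000L;

            System.out.println("wall-clock elapsed (ms): " + wallElapsed);
            System.out.println("monotonic elapsed (ms):  " + monoElapsed);
        }
    }

If ntpd was stepping the slave's clock to close that 10-second gap while
the test was running, that could also account for the 0 ms average, with
negative samples cancelling out positive ones.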

Thanks,

Carl

On 10/20/09 1:06 PM, "Deepak Shetty" <shet...@gmail.com> wrote:

> Are the clocks on both machines in sync?
> 
> On Tue, Oct 20, 2009 at 11:02 AM, Carl Shaulis <cshau...@homeaway.com> wrote:
> 
>> Hello,
>> 
>> We have recently set up a distributed JMeter environment.  I am using my
>> MacBook Pro as the master and a Linux machine as the slave.  I executed a
>> very simple test for 5 minutes, in which 500 concurrent users access a
>> static HTML page.  The results showed an average response time of 0 ms.
>> Looking more closely at the data, there are numerous transactions that
>> look like this:
>> 
>> Thread Name: SorryPageTest 1-97
>> Sample Start: 2009-10-20 12:42:29 CDT
>> Load time: -897
>> Latency: -897
>> Size in bytes: 1723
>> Sample Count: 1
>> Error Count: 0
>> Response code: 200
>> Response message: OK
>> 
>> How can you get a negative load time and negative latency with a 200
>> response code?
>> 
>> Help!
>> 
>> Carl
>> 

