The whole point of me trying to verify the JMeter throughput
calculations is to use the same formula in Excel or some other program.
 

If what you say below is true, then that is not what I'm seeing.  I may
be looking at the wrong code, but I am using version 2.0 and this is
what I see in that source code:

In SampleResult.java, this constructor is called when a line of results
is read from the .jtl file.  According to the code, if the
sampleresult.timestamp.start property is not found, the default is to
treat the timestamp as the end timestamp.
public SampleResult(long stamp, long elapsed)
{
    // Maintain the timestamp relationships
    if (startTimeStamp) 
    {
        setTimes(stamp, stamp + elapsed);
    } 
    else 
    {
        setTimes(stamp - elapsed, stamp);
    }
}
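
To see what the two branches produce, here is a small standalone sketch
(not JMeter code; the reconstructTimes helper is mine and only repeats
the constructor's arithmetic), using the first Request 2 sample from my
results below, timestamp 1086285141248 with elapsed 40 ms:

// Standalone illustration of the two timestamp modes.
// reconstructTimes is my own helper, not part of JMeter.
public class TimestampModes
{
    // returns {startTime, endTime}
    static long[] reconstructTimes(long stamp, long elapsed, boolean stampIsStart)
    {
        return stampIsStart
            ? new long[] { stamp, stamp + elapsed }  // stamp marks the start
            : new long[] { stamp - elapsed, stamp }; // stamp marks the end
    }

    public static void main(String[] args)
    {
        long[] asStart = reconstructTimes(1086285141248L, 40, true);
        long[] asEnd   = reconstructTimes(1086285141248L, 40, false);
        System.out.println(asStart[0] + " .. " + asStart[1]); // 1086285141248 .. 1086285141288
        System.out.println(asEnd[0] + " .. " + asEnd[1]);     // 1086285141208 .. 1086285141248
    }
}

So the constructor itself handles either mode consistently.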

Then a SampleResult object is added to the RunningSample object for
that request type.  
public synchronized void addSample(SampleResult res)
{
    long aTimeInMillis = res.getTime();
    boolean aSuccessFlag = res.isSuccessful();

    counter++;
    long startTime = res.getTimeStamp() - aTimeInMillis;
    if (firstTime > startTime)
    {
        // this is our first sample, set the start time to current timestamp
        firstTime = startTime;
    }
    if(lastTime < res.getTimeStamp())
    {
        lastTime = res.getTimeStamp();
    }
    runningSum += aTimeInMillis;        
    if (aTimeInMillis > max)
    {
        max = aTimeInMillis;
    }        
    if (aTimeInMillis < min)
    {
        min = aTimeInMillis;
    }        
    if (!aSuccessFlag)
    {
        errorCount++;
    } 
}
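
Since my goal is to redo this calculation in Excel, here is my own
standalone re-implementation of the accumulation above.  The class and
field names are mine; only the arithmetic is meant to match, and the
rate formula is the (count/howLongRunning) * 1000.0 quoted further down
in this thread:

// My own sketch for reproducing JMeter's throughput numbers outside JMeter.
public class ThroughputCheck
{
    long counter = 0;
    long firstTime = Long.MAX_VALUE; // earliest reconstructed start
    long lastTime = 0;               // latest timestamp seen

    void addSample(long timeStamp, long elapsed)
    {
        counter++;
        long startTime = timeStamp - elapsed; // same assumption as addSample() above
        if (firstTime > startTime) firstTime = startTime;
        if (lastTime < timeStamp)  lastTime  = timeStamp;
    }

    // (count / howLongRunning) * 1000.0
    double requestsPerSecond()
    {
        long howLongRunning = lastTime - firstTime;
        return (counter / (double) howLongRunning) * 1000.0;
    }

    public static void main(String[] args)
    {
        ThroughputCheck request2 = new ThroughputCheck();
        request2.addSample(1086285141248L, 40); // first Request 2 from my results
        request2.addSample(1086285141309L, 51); // last Request 2 from my results
        System.out.println(request2.requestsPerSecond()); // about 19.8 requests/second
    }
}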

In this statement:

long startTime = res.getTimeStamp() - aTimeInMillis;

the code assumes that the timestamp is always the end timestamp.  But
the timestamp can be a start timestamp, and in that case subtracting
the elapsed time pushes the start timestamp earlier than the request
actually started.  Am I wrong?  This is the behavior for the start time
that I described in my original email.
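
To put numbers on it: with sampleresult.timestamp.start=true, the first
Request 2 sample in my results is logged with timestamp 1086285141248
and elapsed 40, so getTimeStamp() already is the start time.
addSample() then computes 41248 - 40 = 41208 as firstTime, which is 40
ms earlier than the request actually started.  If my reading of the
code is right, that is exactly the extra subtraction in
(2 / (41309 - (41248 - 40))) * 1000.0 that I asked about in my original
email.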





--- Michael Stover <[EMAIL PROTECTED]> wrote:
I thought we were talking about JMeter's throughput calculation?  In
JMeter's calculation, if timestamps are at the beginning, then the
ending time takes the sample time of the last sample into account.  If
the timestamps are at the end of the samples, then the beginning time
takes into account the sample time of the first sample.  What you do in
Excel is up to you.

-Mike

On Fri, 2004-06-04 at 16:05, Remedy QA wrote:
Well, I was just trying to verify the calculations and wanted a smaller
results file to handle.  But I did run it longer and the error margins
get less significant.

What do you mean the CVS code accounts for the time of the sample
regardless of start or end timestamps?  If I were to export the results
file to something like Excel and do the throughput calculations in
there, would the timestamps be accurate?

--- Michael Stover <[EMAIL PROTECTED]> wrote:
If you're looking for throughput numbers, you should be running your
test for a longer time than 40 ms.  Try running for 30 minutes and then
see how much the error is.

In any case, the code in CVS accounts for the time of the sample,
whether your timestamp is at the start or end of the sample.

-Mike

On Thu, 2004-06-03 at 19:15, Remedy QA wrote:
I am a bit confused as to how JMeter calculates the throughput of each
type of HTTP Request.

My jmeter.properties setting has the default
sampleresult.timestamp.start=true
which is supposed to log the timestamp of when an HTTP Request starts
in the .jtl file.

So for example, I have the following results in csv format:

TIMESTAMP|TIME|LABEL|RESPONSE CODE|RESPONSE MSG|THREAD NAME|DATATYPE|SUCCESS
1086285141198|10|Request 1|200|OK|Thread Group1-2|text|true
1086285141248|40|Request 2|200|OK|Thread Group1-2|text|true
1086285141258|20|Request 1|200|OK|Thread Group1-1|text|true
1086285141309|51|Request 2|200|OK|Thread Group1-1|text|true

JMeter calculates the throughput per second as
(count/howLongRunning) * 1000.0

The part I'm confused about is the howLongRunning time.  The source
code says that the total time passed is taken from the timestamps of
the first and last samples.  The timestamps are supposed to be the
start time of each request, according to the jmeter.properties
configuration.

So for Request 2, the last sample timestamp is 41309 while the
timestamp for the first sample is 41248.  JMeter calculates

(2 / (41309 - (41248 - 40))) * 1000.0

Why does it take the first sample and subtract the response time that
it took to execute the first Request 2?

I would think that, instead, it should add the response time of the
last Request 2 to its timestamp (41309 + 51).  That would actually be
the time passed from the start of the first Request 2 to the real end
of the last Request 2.
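
(In numbers: JMeter's window for Request 2 comes out to
41309 - (41248 - 40) = 101 ms, while what I am proposing would be
(41309 + 51) - 41248 = 112 ms.)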

Is the timestamp really the start time of the request?


thanks,
mabel




        
                
