And the time each story takes to complete is reported in the reports
view.
If, on the other hand, you want more fine-grained stop-watch
metrics (e.g. at the scenario level), you'll need to implement them
yourself.
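As a starting point for rolling your own, here is a minimal sketch of scenario-level timing in plain Java. The class and method names are illustrative only (not part of the JBehave API); it assumes you can invoke the scenario body yourself as a `Runnable`.

```java
// Minimal stop-watch sketch for per-scenario timing. Illustrative only:
// ScenarioStopWatch is a hypothetical helper, not a JBehave class.
public class ScenarioStopWatch {

    /** Runs the scenario body and returns elapsed time in milliseconds. */
    public static long timeScenario(Runnable scenarioBody) {
        long start = System.nanoTime();
        scenarioBody.run();
        return (System.nanoTime() - start) / 1_000_000L;
    }

    /** Returns true if the scenario body completed within the limit. */
    public static boolean ranWithin(Runnable scenarioBody, long limitMillis) {
        return timeScenario(scenarioBody) <= limitMillis;
    }
}
```

A test could then fail on `ranWithin(body, 500)` returning false, which is the "runs in under x seconds" check discussed below in the thread.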
On Fri Oct 21 11:02:14 2011, Mauro Talevi wrote:
Hi Seth,
JBehave already supports a timeout for a given story. You
can specify the timeout in seconds via the Embedder, Maven, or Ant.
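To illustrate the general mechanism behind such a timeout (this is the standard `ExecutorService`/`Future` pattern, not JBehave's actual implementation, and `StoryTimeout` is a hypothetical name):

```java
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Generic timeout wrapper illustrating the kind of per-story limit JBehave
// applies; a sketch of the pattern, not JBehave's own code.
public class StoryTimeout {

    /** Runs the task, returning true if it finished within the limit. */
    public static boolean runWithTimeout(Runnable task, long timeoutSecs) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            pool.submit(task).get(timeoutSecs, TimeUnit.SECONDS);
            return true;
        } catch (TimeoutException e) {
            return false; // the task exceeded its time limit
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdownNow(); // interrupt the task if it is still running
        }
    }
}
```

With JBehave itself you would not write this by hand; you would configure the story timeout through the Embedder (or the Maven/Ant equivalents) as noted above.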
And yes, JBehave can certainly be used for performance testing.
Cheers
On 21/10/2011 02:16, Seth Carter wrote:
In another life I worked on a Python-based test framework using
pyUnit and pyUnitPerf, and was able to add a test and run it normally
for pass/fail based on business logic. If a test lent itself
nicely to performance testing, I could run it (the same test) inside a
timed wrapper for pass/fail based on a time limit. Further, I could
supply a tolerance time, a number of users, a number of iterations per
user, and a delay between iterations/test/user. I'm now working with a
series of Java apps and (of course) JBehave, and have come to the same
question from a developer:
"Well, what if I want to make sure my test (scenario) runs in under x
seconds?"
This reminded me of the problem solved with pyUnitPerf. I liked the
idea of writing the test once, with the option to time it, or to load
and time it. pyUnitPerf is a port of JUnitPerf, so I figured the same
is possible with a JBehave test, since it is based on JUnit?
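For readers unfamiliar with the pyUnitPerf/JUnitPerf style being described, here is a rough plain-Java sketch of such a load wrapper: N concurrent "users", each running the same test body a number of times with a delay between iterations. `LoadRunner` is a hypothetical name, not a JBehave or JUnitPerf API.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Sketch of a JUnitPerf-style load wrapper: concurrent "users" each run the
// same test body several times, pausing between iterations. Illustrative only.
public class LoadRunner {

    /**
     * Runs testBody under the given load and returns total elapsed
     * milliseconds, so a caller can assert it stayed under a limit.
     */
    public static long run(Runnable testBody, int users,
                           int iterationsPerUser, long delayMillis) {
        ExecutorService pool = Executors.newFixedThreadPool(users);
        long start = System.nanoTime();
        for (int u = 0; u < users; u++) {
            pool.submit(() -> {
                for (int i = 0; i < iterationsPerUser; i++) {
                    testBody.run();
                    try {
                        Thread.sleep(delayMillis);
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(1, TimeUnit.MINUTES);
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return (System.nanoTime() - start) / 1_000_000L;
    }
}
```

The tolerance-time check then reduces to asserting that the returned elapsed time is below a chosen limit.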
The real problem is that my Java skills are laughable (but improving),
so I'm wondering if anyone has fooled around with this?
I'm also wondering if there is some performance module in JBehave that
I have completely missed.
Thanks,
Seth
---------------------------------------------------------------------
To unsubscribe from this list, please visit:
http://xircles.codehaus.org/manage_email