Hi Mikhail, 

>  When we run the tests (those ones that we call unit tests :) in performance mode
> we start a timer, then run the same testcase in a loop millions of
> times (and we do NOT call them via reflection) then we stop a timer

Here is a flavour of what works for a lot of people...
 
Use the JUnit type RepeatedTest (see [1]) to wrap any concrete JUnit Test 
(e.g. a TestSuite, a TestCase, etc.) so that the test (or tests, if it is a 
TestSuite) gets run as many millions of times as you want. Then take that 
very same RepeatedTest object and wrap it inside a JUnitPerf TimedTest (see 
[2]), passing in the maximum elapsed time that the RepeatedTest is allowed 
to run for. If the RepeatedTest (which is running the original test case or 
suite millions of times for you) doesn't complete within the specified 
maximum time then the test is a failure. 
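
To make that concrete, here is a rough sketch of the kind of thing I mean 
(FooTest, the iteration count and the time budget below are just 
placeholders, not real Harmony classes or figures):

import junit.extensions.RepeatedTest;
import junit.framework.Test;
import junit.framework.TestSuite;
import com.clarkware.junitperf.TimedTest;

public class FooPerfTest {

    public static Test suite() {
        // Any existing concrete Test will do: a single TestCase, a whole
        // TestSuite, whatever you already have.
        Test unitTests = new TestSuite(FooTest.class);

        // Run it a million times, exactly as JUnit would run it once.
        Test repeated = new RepeatedTest(unitTests, 1000000);

        // Fail the test if the million iterations take longer than 60
        // seconds in total.
        return new TimedTest(repeated, 60000);
    }

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}

Note that FooTest itself is completely untouched; all of the timing 
machinery lives in the wrapper.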


> As I understand your point, you are going to measure a single call of
> a test method and that call is over reflection, or multiple calls but
> each of them over reflection.

No, that was not my point. My point is that, at the unit test level, it is 
possible to carry out performance timings using *established* JUnit 
techniques that are *dynamic* in nature and do *not* require adding a 
superclass to the inheritance hierarchy so that every child class "is a" 
performance test spouting out lots of "== test Foo passed OK ==" messages. 
What happens if you want to carry out some stress testing with the unit 
tests? Does it mean the addition of a new superclass at the top of the 
hierarchy so that every unit test class "is a" stress test as well? And so 
on for interop tests, etc., etc.?
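
And if stress testing is wanted later, it is just another wrapper around 
the very same tests rather than another superclass, e.g. using JUnitPerf's 
LoadTest decorator (again, the class name and figures below are only 
placeholders):

import junit.framework.Test;
import junit.framework.TestSuite;
import com.clarkware.junitperf.LoadTest;
import com.clarkware.junitperf.TimedTest;

public class FooStressTest {

    public static Test suite() {
        // Simulate 10 concurrent "users", each running the existing suite
        // 100 times...
        Test load = new LoadTest(new TestSuite(FooTest.class), 10, 100);

        // ...and fail if the whole lot takes longer than 30 seconds.
        return new TimedTest(load, 30000);
    }
}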


> Just wanted to say that reflection part of the call might take 99% of method
> execution and results would not be reliable. Please correct me if I am wrong.

You could adjust your target timings to take into account the effects of 
the test framework. What would be the problem in doing that? If you are 
looking for the introduction of performance problems then isn't it the 
relative timings (e.g. "before versus after the introduction of new code" 
or "test on JRE1 versus identical test on JRE2") that matter? 


Best regards, 
George

[1] 
http://www.junit.org/junit/javadoc/3.8.1/junit/extensions/RepeatedTest.html
[2] http://clarkware.com/software/JUnitPerf.html#howtouse
________________________________________
George C. Harley





Mikhail Loenko <[EMAIL PROTECTED]> 
18/01/2006 17:06
Please respond to
harmony-dev@incubator.apache.org


To
harmony-dev@incubator.apache.org
cc

Subject
Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests 
into a decorator class.






Hi George

When we run the tests (those ones that we call unit tests :) in performance mode
we start a timer, then run the same testcase in a loop millions of times
(and we do NOT call them via reflection) then we stop a timer

As I understand your point, you are going to measure a single call of
a test method and that call is over reflection, or multiple calls but
each of them over reflection.
Just wanted to say that reflection part of the call might take 99% of method
execution and results would not be reliable. Please correct me if I am wrong.

Thanks,
Mikhail

On 1/18/06, George Harley1 <[EMAIL PROTECTED]> wrote:
> Hi Mikhail,
>
> > The messages are important to analyze failures also.
>
> What is JUnit's own failure reporting mechanism not providing to you?
>
>
> > And the possibility to test performance is also important
>
> Yes it is. But - to return to the original point of the HARMONY-31 JIRA
> issue - this can be done without the need to either bring in a superclass
> in the test hierarchy and/or scatter logging calls around the test code.
> Performance testing using JUnit should be done in a transparent manner
> using a decorator. As an example of what I mean please take a look at the
> JUnitPerf site at http://clarkware.com/software/JUnitPerf.html . JUnitPerf
> is an extension to JUnit that helps with the creation of performance tests
> that match existing unit tests. It does this using a decorator design.
> This provides for a separation of concerns that benefits developers and
> performance engineers.
>
> The decorator approach means that we effectively *extend* JUnit with
> simple wrapper classes in which we can make full use of the JUnit API to
> give us the additional behaviour needed when running the tests (maybe this
> could be used to give whatever extra failure analysis data you say is
> lacking?). And if somebody doesn't want the extension behaviour? They
> just run the suite without the custom wrappers.
>
>
> > When we have a powerful performance suite we may revisit this.
>
> Don't you mean when we have a powerful unit test suite?
>
> Take care,
> George
> ________________________________________
> George C. Harley
>
>
>
>
>
> Mikhail Loenko <[EMAIL PROTECTED]>
> 18/01/2006 12:57
> Please respond to
> harmony-dev@incubator.apache.org
>
>
> To
> harmony-dev@incubator.apache.org
> cc
>
> Subject
> Re: [jira] Commented: (HARMONY-31) Move performance timing of unit tests
> into a decorator class.
>
>
>
>
>
>
> Well, those messages were important for developers when
> they were writing code and tests. Then tests came to repository 'as is'.
>
> The messages are important to analyze failures also.
> And the possibility to test performance is also important
>
> For me any option that does not break functionality in favor of beauty looks
> good. I suggest switching off logging by default and either keep
> PerformanceTest super class as is or replace it with a simple Logger class.
>
> When we have a powerful performance suite we may revisit this.
>
> Thanks,
> Mikhail.
>
> On 1/18/06, Tim Ellison <[EMAIL PROTECTED]> wrote:
> > Absolutely right -- writing meaningful performance tests is hard.
> > Implementing your own Logger would not solve the problem though<g>.
> >
> > Best to avoid the 'This test worked OK' log messages altogether, and
> > stick to assertions.
> >
> > Regards,
> > Tim
> >
> > Mikhail Loenko wrote:
> > > It might be a problem...
> > >
> > > When we use java.util.logging we do not just compare performance of
> > > security API functions, the result also depends on the difference in
> > > performance of java.util.logging in standard classes vs. Harmony classes.
> > > So if we use non-trivial functionality from there then our results will
> > > be spoiled a little.
> > >
> > > Will investigate more...
> > >
> > > Thanks,
> > > Mikhail.
> > >
> > > On 1/17/06, Tim Ellison <[EMAIL PROTECTED]> wrote:
> > >> neither is the Logger class -- so my point is if you are going to write
> > >> some logging code why not do it in java.util.logging?  You may choose to
> > >> only do simple stubs for now until somebody steps up to do a real impl.
> > >>
> > >> Regards,
> > >> Tim
> > >>
> > >> Mikhail Loenko wrote:
> > >>> It's not yet implemented.
> > >>>
> > >>> thanks,
> > >>> Mikhail
> > >>>
> > >>> On 1/17/06, Tim Ellison <[EMAIL PROTECTED]> wrote:
> > >>>> Why not use java.util.logging?
> > >>>>
> > >>>> Regards,
> > >>>> Tim
> > >>>>
> > >>>> Mikhail Loenko (JIRA) wrote:
> > >>>>>     [ http://issues.apache.org/jira/browse/HARMONY-31?page=comments#action_12362910 ]
> > >>>>>
> > >>>>> Mikhail Loenko commented on HARMONY-31:
> > >>>>> ---------------------------------------
> > >>>>>
> > >>>>> This is not what I meant.
> > >>>>>
> > >>>>> I was going to create a Logger class at this point like this:
> > >>>>>
> > >>>>> public class Logger {
> > >>>>>     public static boolean printAllowed = false;
> > >>>>>     public static void log(String message) {
> > >>>>>         if (printAllowed) System.out.print(message);
> > >>>>>     }
> > >>>>>     public static void logln(String message) {
> > >>>>>         if (printAllowed) System.out.println(message);
> > >>>>>     }
> > >>>>>     public static void logError(String message) {
> > >>>>>         if (printAllowed) System.err.print(message);
> > >>>>>     }
> > >>>>>     public static void loglnError(String message) {
> > >>>>>         if (printAllowed) System.err.println(message);
> > >>>>>     }
> > >>>>> }
> > >>>>>
> > >>>>> And replace log() with Logger.log() everywhere in the tests.
> > >>>>>
> > >>>>> All the remaining functionality in the PerformanceTest is
> obsolete.
> > >>>>>
> > >>>>>
> > >>>>>> Move performance timing of unit tests into a decorator class.
> > >>>>>> ------------------------------------------------------------
> > >>>>>>
> > >>>>>>          Key: HARMONY-31
> > >>>>>>          URL: http://issues.apache.org/jira/browse/HARMONY-31
> > >>>>>>      Project: Harmony
> > >>>>>>         Type: Improvement
> > >>>>>>     Reporter: George Harley
> > >>>>>>     Assignee: Geir Magnusson Jr
> > >>>>>>     Priority: Minor
> > >>>>>>  Attachments: PerfDecorator.java
> > >>>>>>
> > >>>>>> There has been some low-level discussion on the dev mailing list
> > >>>>>> recently about the inclusion of performance-related logging code near
> > >>>>>> the top of a unit test class inheritance hierarchy (see
> > >>>>>> com.openintel.drl.security.test.PerformanceTest in the HARMONY-16
> > >>>>>> contribution). This particular issue suggests an alternative way of
> > >>>>>> adding in timing code but without making it the responsibility of the
> > >>>>>> unit tests themselves and without the need to introduce a class in
> > >>>>>> the inheritance hierarchy.
> > >>>>>> The basic approach is to exploit the junit.extensions.TestDecorator
> > >>>>>> type in the JUnit API to add in timing behaviour before and after
> > >>>>>> each test method runs. This will be demonstrated with some simple
> > >>>>>> sample code.
> > >>>> --
> > >>>>
> > >>>> Tim Ellison ([EMAIL PROTECTED])
> > >>>> IBM Java technology centre, UK.
> > >>>>
> > >> --
> > >>
> > >> Tim Ellison ([EMAIL PROTECTED])
> > >> IBM Java technology centre, UK.
> > >>
> > >
> >
> > --
> >
> > Tim Ellison ([EMAIL PROTECTED])
> > IBM Java technology centre, UK.
> >
>
>
>

