Bernhard Huber wrote:
>
> Hi
> variance of between what?
> Between 1,503 and 1,502 I don't know.
>
> 1)Testcase: testSizeRotationUniqueFilename took 1,503 sec
> 1)Testcase: testSizeRotationRevolingFilename took 1,502 sec
> 2)Testcase: testTimeRotationUniqueFilename took 10,004 sec
> 2)Testcase: testTimeRotationRevolvingFilename took 10,005 sec
Before we worry about millisecond-level variations, I want to take
the time to comment on performance benchmarking. A benchmark is only
useful when it documents consistent trends in your algorithm. It is
not good for absolute numbers, as those have to be normalized in
some way (usually against a reference system).
Benchmarking results can be affected by the underlying operating
system as well as by the Java Virtual Machine. It could be that you
saw more garbage collection cycles in one test than in another,
accounting for the cumulative millisecond or two. It could be that
the test algorithm is sufficiently different to consistently add
that much to the time between the different testcases. Lastly, it
could be the implementation itself that is the cause of the delay.
With the numbers we are talking about (1.503 and 1.502 seconds), the
difference is only about 0.07%. For the other tests (10.004 and
10.005 sec) it is only about 0.01%. Hardly worth spending a lot of
time on.
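If you want to see whether the JVM is a factor, you can run the
suite with the -verbose:gc flag, or instrument the timed region by
hand. Purely as a sketch (none of this is part of
TestRotatingFileOutputLogTarget, and the class name is hypothetical):

  // Hypothetical timing sketch -- not part of the LogKit test suite.
  public class HeapAwareTimingSketch
  {
      public static void main( final String[] args )
          throws Exception
      {
          final Runtime runtime = Runtime.getRuntime();
          final long usedBefore =
              runtime.totalMemory() - runtime.freeMemory();
          final long start = System.currentTimeMillis();

          // ... the body of a rotation test would run here ...
          Thread.sleep( 100 );

          final long elapsed = System.currentTimeMillis() - start;
          final long usedAfter =
              runtime.totalMemory() - runtime.freeMemory();

          // a heap delta that shrinks, or jumps wildly between runs,
          // hints at collections inside the timed region
          System.out.println( "elapsed=" + elapsed + " ms, heap delta="
              + ( usedAfter - usedBefore ) + " bytes" );
      }
  }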
> Between 1,503 and 10,004, and
> between 1,502 and 10,005 is due to writing 10 times more
> data in the latter case.
This satisfactorily accounts for the fact that it takes 6.7 times longer.
> Just one point, in the revolving case first a unique-filename
> is written, as openFile() is called too early.
> I have not fixed that, yet, perhaps I can do it before the weekend...
>
> I'm doing some more tests as you asked me about variance,
> and now I have following results:
> Testsuite: org.apache.log.output.test.TestRotatingFileOutputLogTarget
> Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 33,489 sec
>
> ------------- Standard Output ---------------
> File(s) testSizeRotationUniqueFilename.log written: 2270065
> File(s) testSizeRotationRevolingFilename.log written: 2271565
> File(s) testTimeRotationUniqueFilename.log written: 15089260
> File(s) testTimeRotationRevolingFilename.log written: 14864390
> File(s) testTimeRotationRevolvingFilenameCycles.log written: 15181155
> ------------- ---------------- ---------------
> Testcase: testSizeRotationUniqueFilename took 1,863 sec
> Testcase: testSizeRotationRevolingFilename took 1,602 sec
> Testcase: testTimeRotationUniqueFilename took 10,005 sec
> Testcase: testTimeRotationRevolvingFilename took 10,004 sec
> Testcase: testTimeRotationRevolvingFilenameCycles took 10,005 sec
>
> What's strange is the diff between 1,863 and 1,602
That's a 16% difference, which now qualifies as significant. Was
anything else running in the background when you ran the test? Was
that result consistent?
I realize these take a long time to run, but the tests are often
more meaningful as averages (e.g. run 10 or more times with the
results from each run averaged together).
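As a rough sketch of what I mean (none of this is existing code;
runOnce() is just a placeholder for the body of one rotation test):

  // Hypothetical averaging harness, not part of the existing tests.
  public class AveragedRunSketch
  {
      /** Placeholder for the body of one rotation test. */
      private static void runOnce()
          throws Exception
      {
          Thread.sleep( 100 );
      }

      public static void main( final String[] args )
          throws Exception
      {
          final int runs = 10;

          // warm-up pass so class loading and JIT compilation
          // do not skew the first measured run
          runOnce();

          long total = 0;
          for( int i = 0; i < runs; i++ )
          {
              final long start = System.currentTimeMillis();
              runOnce();
              final long elapsed = System.currentTimeMillis() - start;
              total += elapsed;
              System.out.println( "run " + i + ": " + elapsed + " ms" );
          }
          System.out.println( "average: " + ( total / runs )
              + " ms over " + runs + " runs" );
      }
  }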
I also noticed that you have different amounts of data written in
each test. In order to test properly you need to hold either the
time constant (run until a set time expires), or the data written
constant (run until a defined amount of data is written).
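Purely as an illustration (writeEntry() is a made-up placeholder,
not LogKit API, and the constants are arbitrary), the two options
would look something like:

  // Hypothetical sketch of the two ways to hold the workload constant.
  public class ConstantWorkloadSketch
  {
      private static final long TARGET_BYTES = 2 * 1024 * 1024;
      private static final long RUN_MILLIS = 10 * 1000;

      /** Placeholder for one log write; returns the bytes produced. */
      private static int writeEntry()
      {
          return 128;
      }

      public static void main( final String[] args )
      {
          // (a) constant data: stop once a fixed number of bytes is
          //     written, then compare elapsed times between targets
          long written = 0;
          final long start = System.currentTimeMillis();
          while( written < TARGET_BYTES )
          {
              written += writeEntry();
          }
          System.out.println( TARGET_BYTES + " bytes took "
              + ( System.currentTimeMillis() - start ) + " ms" );

          // (b) constant time: stop after a fixed interval, then
          //     compare the bytes written between targets
          final long deadline = System.currentTimeMillis() + RUN_MILLIS;
          long bytes = 0;
          while( System.currentTimeMillis() < deadline )
          {
              bytes += writeEntry();
          }
          System.out.println( bytes + " bytes written in "
              + RUN_MILLIS + " ms" );
      }
  }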
<snip-the-old-stuff/>