Can I ask a dumb question?

>    - We run each test case 8 times.
>    - Remove the first round result, and from the remaining 7 results,
>    remove the minimum and the maximum.

Naturally the first run will be loading all the libraries, DLLs,
dependencies, etc. into memory, so it will be significantly different.

So I understand why you would not include that run with the other results. But
what about the first-run data in and of itself?

If OO takes a long time to start on the first run, isn't that an issue?

Cheers

Colin

On Tue, Mar 26, 2013 at 2:26 PM, Li Feng Wang <phoenix.wan...@gmail.com> wrote:

> (1) For the PVT report, you can record it on the wiki at
> http://wiki.openoffice.org/wiki/QA/Report
>
> *GUI PVT:*
>     1) Open pvt.gui.Benchmark.xml with OO
>     2) Compute the test result:
>
>    - We run each test case 8 times.
>    - Remove the first round result, and from the remaining 7 results,
>    remove the minimum and the maximum.
>    - Compute the average and standard deviation of the remaining 5
>    results (a sketch of this computation follows the list). Note: the
>    standard deviation reflects whether the 8 round results are stable.
>
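> A minimal sketch of that computation in Java (class and method names are
> illustrative, not part of the PVT scripts):
>
>     import java.util.Arrays;
>
>     public class PvtResult {
>
>         // results: the 8 raw timings of one test case, in run order.
>         // Drop the first (warm-up) round, sort the remaining 7, and
>         // drop the minimum and the maximum, leaving 5 values.
>         static double[] trim(double[] results) {
>             double[] rest = Arrays.copyOfRange(results, 1, results.length);
>             Arrays.sort(rest);
>             return Arrays.copyOfRange(rest, 1, rest.length - 1);
>         }
>
>         static double mean(double[] v) {
>             double sum = 0;
>             for (double x : v) sum += x;
>             return sum / v.length;
>         }
>
>         // Standard deviation of the trimmed values; a large value means
>         // the rounds were not stable.
>         static double stdDev(double[] v) {
>             double m = mean(v);
>             double sq = 0;
>             for (double x : v) sq += (x - m) * (x - m);
>             return Math.sqrt(sq / v.length);
>         }
>     }
>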
> *UNO PVT:*
>     1) Open pvt.uno.Conversion.xml with OO
>     2) Report your result without computation.
>
>      First of all, you need to run on the AOO 3.4.1 release to establish a
> baseline, to make your PVT result more meaningful.
>
> (2) For the test.uno project, you need to follow
> http://wiki.openoffice.org/wiki/QA/test_automation_guide
>     I've run the uno project in Eclipse, and it works fine.
>     I will run the uno project from the command line and try to find your
> environment problem.
>
>
> 2013/3/26 Anders Kvibäck <akva1...@gmail.com>
>
> > Hello!
> >
> > I uninstalled both 3.4.1 and 4.0 and the language packs. Then I just
> > installed 4.0, and now it works again. Sorry about that.
> >
> > I've done some sporadic testing on the test.gui project just to try it
> > out. Basic function tests work fine. I will come back to that.
> >
> >
> >  But now I can't run the  test.uno project. I get an error message:
> >
> > "The archive: C:/Users/Anders/Desktop/OOo-dev 3.5 (en-US) Installation
> > Files/Basis/program/classes/unoil.jar which is referenced by the
> classpath,
> > does not exist."
> >
> > So, it is something with the settings of my environment variables here,
> > i.e. the classpath. What shall I type in there?
> >
> >
> > I see that in the class MixedTest it can't find the XTextDocument class;
> > I get these error messages:
> >
> > "Multiple markers at this line
> >     - The method newDocument(String) from the type UnoApp
> >      refers to the missing type XComponent
> >     - XTextDocument cannot be resolved to a type
> >     - UnoRuntime cannot be resolved
> >     - XTextDocument cannot be resolved to a type
> >     - XTextDocument cannot be resolved to a type"
> >
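> > For reference, those names come from the standard UNO Java API (they live
> > in the jars under the office's program/classes directory, e.g. unoil.jar),
> > so the errors point at jars missing from the build path rather than at the
> > test code itself. A minimal sketch of the kind of call involved, assuming
> > the UNO jars resolve (the helper name is taken from the error text above):
> >
> >     import com.sun.star.lang.XComponent;
> >     import com.sun.star.text.XTextDocument;
> >     import com.sun.star.uno.UnoRuntime;
> >
> >     public class QueryExample {
> >         // component is what UnoApp.newDocument(...) returns once the UNO
> >         // jars resolve; queryInterface narrows it to the Writer text API.
> >         static XTextDocument asTextDocument(XComponent component) {
> >             return (XTextDocument) UnoRuntime.queryInterface(
> >                     XTextDocument.class, component);
> >         }
> >     }
> >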
> > Do you have any idea?
> >
> > And how shall I report the performance test?
> >
> > Anders
> >
> >
> >
> >
> > 2013/3/25 Li Feng Wang <phoenix.wan...@gmail.com>
> >
> > > Can you describe your steps in more detail?
> > >
> > > I will try to investigate the issue.
> > >
> > > 2013/3/22 Anders Kvibäck <akva1...@gmail.com>
> > >
> > > > Hi!
> > > >
> > > > I've been trying to run some performance tests and I followed the
> > > > guide. I started off with the class AOOTest and it worked fine, so
> > > > then I did the smoke test class and it worked fine, plus some more.
> > > > But now all of a sudden I can't run the tests anymore. I just get the
> > > > message:
> > > >
> > > > *java.lang.RuntimeException: Can't connect to automation server!*
> > > >
> > > >
> > > > JUnit starts fine and I get this message in the console:
> > > >
> > > > *INFO: Start running test method [smokeTest]*
> > > >
> > > > but after 30 seconds it stops.
> > > >
> > > > I run the same tests again, but it doesn't work.
> > > >
> > > > I've spent an hour or two on Google but I couldn't find anything
> > > > useful so far.
> > > >
> > > > Does anyone have an idea how to solve this?
> > > >
> > > >
> > > >
> > > > Anders
> > > >
> > > >
> > > >
> > > > 2013/2/28 Yi Xuan Liu <liuyixuan....@gmail.com>
> > > >
> > > > > Anders, thanks for trying it out.  Did you follow this guide
> > > > > http://wiki.openoffice.org/wiki/QA/test_automation_guide ?
> > > > >
> > > > > I can't find "Disable color calibration" mentioned in the guide.
> > > > >
> > > > > On Wed, Feb 27, 2013 at 10:49 PM, Anders Kvibäck <akva1...@gmail.com> wrote:
> > > > >
> > > > > > Hi!
> > > > > >
> > > > > > Yes, I would like to run performance tests. I've started to look
> > > > > > into the link about automated testing that you referred to.
> > > > > > There it says that I have to disable color calibration. But is
> > > > > > this possible in Windows Vista? I just can't find out how to do
> > > > > > it. As far as I understand, this is only possible in Windows 7,
> > > > > > which I unfortunately don't have.
> > > > > >
> > > > > > Anders
> > > > > > 2013/2/26 Yi Xuan Liu <liuyixuan....@gmail.com>
> > > > > >
> > > > > > > Rob, I agree with you. The test environment configuration would
> > > > > > > impact the performance test results. Therefore, we should keep
> > > > > > > the test configuration unchanged from build to build.
> > > > > > >
> > > > > > > I checked the performance bug list in Bugzilla. 5 performance
> > > > > > > bugs were found by the automated performance tests, such as a
> > > > > > > memory leak and a save performance issue.
> > > > > > >
> > > > > > > Lots of performance bugs are found in daily AOO usage.
> > > > > > > Therefore, volunteers who are not willing to run the automated
> > > > > > > performance tests can also report bugs for any performance
> > > > > > > issue, and that will be helpful for us.
> > > > > > >
> > > > > > > On Mon, Feb 25, 2013 at 9:16 PM, Rob Weir <robw...@apache.org>
> > > > wrote:
> > > > > > >
> > > > > > > > On Sun, Feb 24, 2013 at 11:42 PM, Yi Xuan Liu <liuyixuan....@gmail.com> wrote:
> > > > > > > > > Hi, all:
> > > > > > > > >
> > > > > > > > > AOO 4.0 will be released soon. Performance plays an important
> > > > > > > > > role in software quality. Are there any volunteers who want to
> > > > > > > > > run the performance tests?
> > > > > > > > >
> > > > > > > > > I've run the AOO performance tests on my own machines. I've
> > > > > > > > > tried 3 platforms: Windows XP, Ubuntu 10.04, and Mac OS 10.7.3.
> > > > > > > > > The test configurations are as follows:
> > > > > > > > >
> > > > > > > > > (1) W500; CPU: 2.53 GHz; Mem: 3 GB; OS: Windows XP SP3
> > > > > > > > > (2) Ubuntu; CPU: Intel® Core™ 2 Duo 2 GHz; Mem: 3 GB; OS: Ubuntu 10.04
> > > > > > > > > (3) Mac; CPU: Intel® Core™ 2 Duo 2 GHz; Mem: 3 GB; OS: Mac OS 10.7.3
> > > > > > > > >
> > > > > > > >
> > > > > > > > I assume the volunteer does not need to have exactly the same
> > > > > > > > machine type as yours.  But they need some stability in the
> > > > > > > > configuration.  A performance test might be run first on the
> > > > > > > > AOO 3.4.1 release to establish a baseline.  Then re-test on a
> > > > > > > > current 4.0 snapshot build.  And then re-run on new dev
> > > > > > > > snapshot builds, maybe once a week.
> > > > > > > >
> > > > > > > > The goal is to detect performance regressions early, so
> > > > > > > > developers can fix them.
> > > > > > > >
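> > > > > > > > A minimal sketch of that comparison in Java (the 10% tolerance
> > > > > > > > is illustrative, not an agreed project value):
> > > > > > > >
> > > > > > > >     // Flag a regression when the snapshot's average time exceeds
> > > > > > > >     // the AOO 3.4.1 baseline average by more than the tolerance.
> > > > > > > >     static boolean isRegression(double baselineAvg, double snapshotAvg) {
> > > > > > > >         double tolerance = 0.10; // illustrative threshold
> > > > > > > >         return snapshotAvg > baselineAvg * (1.0 + tolerance);
> > > > > > > >     }
> > > > > > > >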
> > > > > > > > The technical challenge here is to preserve a stable machine
> > > > > > > > configuration.  If the machine changes, because of an OS
> > upgrade,
> > > > or
> > > > > a
> > > > > > > > changed hard drive, or a different network environment, or
> > > because
> > > > of
> > > > > > > > a new anti-virus product, then that confuses things.  We need
> > to
> > > > > > > > "control all the variables".
> > > > > > > >
> > > > > > > > One approach to controlling all of the variables is to have a
> > > > machine
> > > > > > > > that is used for nothing but performance testing.  That way
> we
> > > know
> > > > > > > > the machine's base performance does not change.
> > > > > > > >
> > > > > > > > Another approach is to re-run the baseline AOO 3.4.1
> > > > > > > > performance tests each week.  This is more tolerant of changes
> > > > > > > > in machine configuration, etc.
> > > > > > > >
> > > > > > > > -Rob
> > > > > > > >
> > > > > > > >
> > > > > > > > > The test scenarios include: AOO startup, file open, and file save.
> > > > > > > > >
> > > > > > > > > Volunteers could run the performance tests on other platforms.
> > > > > > > > >
> > > > > > > > > All the automation scripts can be downloaded from the AOO
> > > > > > > > > project, and it is not difficult to set up the automation
> > > > > > > > > environment. You can follow the guide:
> > > > > > > > > http://wiki.openoffice.org/wiki/QA/test_automation_guide
> > > > > > > > >
> > > > > > > > > For any questions, feel free to contact me :)
> > > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> > >
> > >
> > > --
> > > Best Wishes, LiFeng Wang
> > >
> >
>
>
>
> --
> Best Wishes, LiFeng Wang
>



-- 
Regards

Colin McDermott

0433 400 256
