On Jan 24, 2013, at 6:16 PM, Igor Stasenko <siguc...@gmail.com> wrote:

> On 24 January 2013 18:12, Camillo Bruni <camillobr...@gmail.com> wrote:
>> I've been thinking these days on how to implement regression tests.
>> Obviously the core requirement is to launch a separate image and capture 
>> terminal output and/or result files.
>> 
>> For now I will put them into a separate repository:
>> 
>>        http://smalltalkhub.com/#!/~Pharo/regression-testing
>> 
>> Some questions:
>> - Is there a test ConfigurationOf* which can be used to try loading 
>> different versions / groups?
>> - Is it ok to rely on OSProcess for now to launch a separate new image?
> 
> + 1
> we need such tests.
> And actually it would be even more useful to know which code crashes
> the system, e.g.
> we could even have tests which succeed only if they crash the VM.

Yes!
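One way such crash tests could be detected from outside (a sketch in plain shell, not something in the image yet): on POSIX systems a child killed by a signal exits with status 128 + the signal number, so the driver only has to inspect the exit status. The `kill -SEGV` here is a stand-in for Smalltalk code that actually crashes the VM.

```shell
# Hypothetical crash test: it PASSES only when the child dies abnormally.
# `sh -c 'kill -SEGV $$'` stands in for a VM run that segfaults.
sh -c 'kill -SEGV $$'
status=$?
if [ "$status" -gt 128 ]; then
    echo "PASS: child crashed (signal $((status - 128)))"
else
    echo "FAIL: child exited normally with status $status"
fi
```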

We did a first (ad-hoc) regression tester for Opal:

        https://ci.inria.fr/rmod/job/OpalRegression/

We just keep the tests in different packages and take care which to run when.
The regression-test job first runs slow tests that cannot do any harm
(compiling all methods, but not installing them), then, as a second step,
tests that recompile the whole image; that image is saved, and all tests
are then run on it:


./vm.sh $JOB_NAME.image test --junit-xml-output "OpalCompiler-RegressionTests.*"
./vm.sh $JOB_NAME.image test --junit-xml-output "OpalCompiler-DestructiveRegressionTests.*" --save
./vm.sh $JOB_NAME.image test --junit-xml-output ".*"

With that in place we can be quite sure that a change done on the compiler is
not doing anything wrong.
(and that speeds up development quite a lot…)
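And for the question about capturing terminal output: whether it goes through OSProcess or a shell wrapper, the driver pattern is simply "run the child, capture stdout+stderr, keep the exit code". A minimal shell sketch, where `echo` is a placeholder for the actual VM invocation:

```shell
# Run a child command, capture stdout+stderr, and preserve its exit code.
# `echo` stands in here for something like: ./vm.sh image test ...
run_image() {
    output=$("$@" 2>&1)
    status=$?
    printf '%s\n' "$output"
    return $status
}

run_image echo "4 run, 4 passed"   # prints the captured output
echo "exit code: $?"               # prints "exit code: 0"
```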

        Marcus

