On Sun, 28 Oct 2007, Reece Dunn wrote:
[...]
It would be even better if the tests were also run on real machines,
as that would catch which test failures are VM related (such as the
Direct3D tests).
Sure. However, I don't have real Windows machines, so I'll leave this as
an exercise for someone [...]
On 27/10/2007, Jakob Eriksson [EMAIL PROTECTED] wrote:
Francois Gouget wrote:
And I have recently put together a script for running winetest in
VMware virtual machines unattended (see my other post). So going
forward I will be running it on Windows 98, Windows XP and Windows
2003 [...]
Robert Shearman wrote:
Jakob Eriksson wrote:
[...]
(That CodeWeavers does not have such an installation yet is beyond me. Or
if you do, please make it automatically submit its findings to
test.winehq.org!)
We do. I've got a machine that regularly runs the test on Windows 2003
on real [...]
Francois Gouget wrote:
And I have recently put together a script for running winetest in
VMware virtual machines unattended (see my other post). So going
forward I will be running it on Windows 98, Windows XP and Windows
2003 nightly.
This is so good. test.winehq.org will become several [...]
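Such an unattended run can be sketched roughly as follows. This is a hypothetical reconstruction, not the script referred to above: the `vmrun` subcommands are VMware's own, but the .vmx path, snapshot name, guest credentials and winetest location are all made up for illustration.

```python
# Hypothetical sketch of an unattended winetest run in a VMware guest.
# NOT the actual script -- paths, credentials and names are assumptions.
import subprocess

VMRUN = "vmrun"  # VMware's command-line VM control tool

def winetest_commands(vmx, guest_exe=r"C:\winetest.exe", tag="vmware-xp"):
    """Build the vmrun invocations: revert the VM to a clean snapshot,
    boot it headless, run winetest inside the guest, then shut down."""
    return [
        [VMRUN, "revertToSnapshot", vmx, "clean"],    # known-good state
        [VMRUN, "start", vmx, "nogui"],               # boot without UI
        [VMRUN, "-gu", "tester", "-gp", "secret",     # guest login
         "runProgramInGuest", vmx, guest_exe, "-q", "-t", tag],
        [VMRUN, "stop", vmx, "soft"],                 # clean shutdown
    ]

def run(vmx):
    """Execute the commands in order, stopping on the first failure."""
    for cmd in winetest_commands(vmx):
        subprocess.check_call(cmd)
```

A nightly cron job would then call `run()` once per guest, e.g. `run("/vm/winxp/winxp.vmx")`.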
Jakob Eriksson wrote:
Alexandre Julliard wrote:
If we require tests to pass on all Windows versions before getting
committed it will drastically reduce the number of tests accepted,
with little benefit. In most cases tests fail on some Windows boxes
because they are too strict in the [...]
Alexandre Julliard wrote:
If we require tests to pass on all Windows versions before getting
committed it will drastically reduce the number of tests accepted,
with little benefit. In most cases tests fail on some Windows boxes
because they are too strict in the behavior they expect, and [...]
Robert Shearman wrote:
We do. I've got a machine that regularly runs the test on Windows 2003
on real hardware:
http://test.winehq.org/data/200710241000/2003_rshearman/report
That's excellent!
However, the tests are run by a service rather than manually by me to
reduce the effort needed.
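One minimal way to run the tests unattended on Windows is a scheduled task. This is a config-fragment sketch, not the actual setup: the install path, tag and start time are made up, and only winetest's standard -q (quiet) and -t (tag) switches are used.

```shell
schtasks /create /tn "winetest-nightly" /sc daily /st 01:00 ^
    /tr "C:\winetest\winetest.exe -q -t rshearman"
```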
Looking at yesterday's test results is depressing:
http://test.winehq.org/data/200710241000/
Just looking at the pretty colors may not make this very obvious, but
the state of the tests is APPALLING.
        | Successes | Failures | Failure rate | Not Run
WinXP-1 | 260       | 53       | [...]
Juan Lang wrote:
Just looking at the pretty colors may not make this very obvious, but
the state of the tests is APPALLING.
Agreed. I wonder how much of it has to do with not noticing that the
tests have failed?
I may just be transforming the problem from an easy one (we shouldn't
be lazy about checking the [...]
On Thu, 2007-10-25 at 09:38 -0700, Juan Lang wrote:
I suspect the biggest problem is keeping the winetest executable up to
date on the systems. If the test system can't compile the tests, it
can't easily perform a regression test. What's the biggest obstacle
to that?
There's a lot of machinery needed on a box to rebuild wine, and
Windows boxes typically have no development tools whatsoever.
Okay, but the toolchain to build winetest is relatively small, isn't
it? Could we include that in the Windows version of the tests in
order to speed up our response to [...]