On Mon, Jun 18, 2012 at 11:49:46AM +0200, Rick Spencer wrote:
> On Mon, Jun 18, 2012 at 7:02 AM, Martin Pitt <martin.p...@ubuntu.com> wrote:
> > Sebastien Bacher [2012-06-15 17:26 +0200]:
> >> Can we just drop the image rolling part of milestones?
> >
> > Our automated tests are still way too incomplete for this step. In
> > manual testing we have found quite a number of real deal-breaker bugs
> > which the automatic tests didn't pick up. We also need to test the
> > current images on a wider range of real iron, which is something our
> > automated QA could do one day, but doesn't right now.
> >
> > So regular manual testing rounds are still required, and the points
> > when we do them might just as well be called "milestones".
> 
> But if the focus is testing, we should optimize the schedule around
> testing. For example, I think Ubuntu would benefit from more frequent
> "rounds" of such in depth testing than the current alpha/beta
> milestones provide. (I think every 2 weeks would be a good cadence).

This could be very beneficial if it were more aggressively organized.

We did something similar with proprietary driver testing one release a
few years back.  We had people "join" a team, and then had them install
ISOs and run through a checklist once a week.  I found it quite
valuable, but it required careful organization to be useful.

So this wasn't just an "install the image and file bugs" exercise, but a
deliberate look for serious regressions.  By having each person provide
a continuous series of data points we could spot anomalies much more
easily.  If someone installs things exactly the same way, on the same
hardware, every week, and then one week it suddenly fails, that helps
you narrow things down a lot.  Equally important is seeing that a fix
you roll out does indeed restore functionality across multiple testers.

The key was to be very specific in the data collection, else you can
generate a lot of noise quickly.  Make a printable survey form they can
fill in as they go through the checklist, and a system-info dumping
tool that captures, once they're done, all the logs that might be
needed for bug reports.  The QA team has a tool for capturing all this
data and showing it in a tabulated form so you can spot patterns and
changes over time.
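To give a concrete flavor of what such a dumping tool might look like (this
is a hypothetical sketch, not the actual QA team's tool; the file names and
commands are just common Ubuntu examples), something as simple as a shell
script that bundles hardware info, package state, and relevant logs into one
tarball per test run would do:

```shell
#!/bin/sh
# Hypothetical system-info dumper for weekly QA runs.
# Collects hardware info, package state, and common logs into one tarball
# so a tester can attach a single file to a bug report.
OUT="qa-report-$(date +%Y%m%d-%H%M%S)"
mkdir -p "$OUT"

# Basic system and hardware state; tools may be missing, so don't fail hard.
uname -a > "$OUT/uname.txt"
lspci -vv > "$OUT/lspci.txt" 2>/dev/null || true
dpkg -l > "$OUT/packages.txt" 2>/dev/null || true

# Copy logs commonly needed for driver regressions, if readable.
for f in /var/log/Xorg.0.log /var/log/dmesg /var/log/syslog; do
    [ -r "$f" ] && cp "$f" "$OUT/"
done

tar czf "$OUT.tar.gz" "$OUT"
echo "Created $OUT.tar.gz"
```

Because every tester produces the same set of files each week, diffing one
week's tarball against the next makes regressions stand out quickly.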

The most important thing is that the data actually get used.  This
testing can take a fair bit of time, but if the testers know their
efforts are helping to make things tangibly better they can really get
passionate about doing it.

Bryce



-- 
ubuntu-devel mailing list
ubuntu-devel@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-devel
