Re: QA tasks available

2011-12-06 Thread Brendan Donegan

On 05/12/11 16:24, Gema Gomez wrote:

Dear QA Team,

as promised, here is a list of tasks that need to be done (and that we
are in the process of doing) which you could own if you have the time:

- ISO testing tasks
(https://blueprints.launchpad.net/ubuntu/+spec/other-p-builds-smoke-testing):

   1) Compile a list of applications that are installed by default by the
ISO installers (one for Desktop, one for Server) and propose two or
three basic test cases that could be run post-install, giving us basic
confidence that the ISO is good for further testing (i.e. compile a list
of post-install smoke tests that we could run with Jenkins).
- This task is not about writing code, but about deciding which of the
installed packages are important and worth testing in a daily test
suite. We could split it into separate tasks for different people if we
first generate a list of apps to base the test cases on.
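A possible starting point for task 1: the desktop and server ISOs ship a
.manifest file listing one "package version" pair per line, so the default
application list can be extracted mechanically. A rough sketch (the
smoke-test candidate set below is purely illustrative):

```python
# Sketch: extract the default-installed package list from an Ubuntu ISO
# .manifest file (one "package<whitespace>version" pair per line), as a
# starting point for choosing post-install smoke tests.

def manifest_packages(manifest_text):
    """Return the package names listed in a .manifest file."""
    names = []
    for line in manifest_text.splitlines():
        line = line.strip()
        if not line:
            continue
        names.append(line.split()[0])
    return names

# Illustrative only -- deciding this set is exactly what task 1 is about.
SMOKE_CANDIDATES = {"python2.7", "apt", "firefox", "nautilus"}

if __name__ == "__main__":
    sample = "apt\t0.8.16\nfirefox\t8.0\nlibfoo1\t1.2-3\n"
    pkgs = manifest_packages(sample)
    print(sorted(set(pkgs) & SMOKE_CANDIDATES))
```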
I'd like to start by encouraging some debate here. For a developer of 
Ubuntu, certain tools are critical, such as Python itself. 
But to what extent is it worthwhile to test that Python works, for 
example? I guess it can't do any harm.


One thing I would like to see tested is apt - the ability to install 
packages is critical.
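A minimal apt check could lean on apt-get's simulate mode, which needs no
root access and changes nothing on disk. A sketch (the package name is
arbitrary; assumes Python 3.5+ for subprocess.run):

```python
# Sketch: verify that apt can resolve an install and a removal using
# "apt-get -s" (simulate mode; no root needed, nothing actually changes).
import subprocess

def apt_command(action, package):
    """Build a simulated apt-get command line."""
    return ["apt-get", "-s", action, package]

def apt_smoke_test(package="hello"):
    """Return True if simulated install and remove both succeed."""
    for action in ("install", "remove"):
        result = subprocess.run(apt_command(action, package),
                                capture_output=True, text=True)
        if result.returncode != 0:
            return False
    return True
```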


Personally I think the most important thing is to test as many different 
configurations as possible (which we may already be doing), such as 
encrypted home directories and choosing (or not) to install non-free 
software/updates during installation.

   2) We need to fix the existing test cases in the tracker and convert
them to a better, more understandable format. Basically, we need to
turn them into unambiguous, meaningful test cases. Some of them are
redundant, some are too long to be a single test case, and others no
longer make sense. This is a tidy-up task that needs to be done.
Is this to be done before putting them into Litmus? I can gladly help 
either way.


- Metrics
(https://blueprints.launchpad.net/ubuntu/+spec/other-p-qa-metrics):
   3) I have some tasks here that could use some help. We need to look at
the codebase of Ubuntu main and see how to instrument the code so that
we can start generating code coverage metrics. This is about compiling
the Ubuntu code with gcov and generating binaries that can be used
(still to be seen how to install them) to this end.
   - This task requires in-depth knowledge of the code and familiarity
with how things are built and how they can be changed to build in a
different way. We should decide where to start instrumenting, and why.
The software-center developers have a very good implementation of code 
coverage reports, so it's worth looking at that package (in the 'tests' 
directory) at least for how to do this with a Python application. This 
is the task I'd be most interested in helping with.
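For the Python side at least, the idea can be demonstrated with nothing but
the standard library's trace module; a toy sketch (gcov would be the
analogue for compiled code):

```python
# Toy coverage measurement with the stdlib trace module: run a function
# under the tracer, then inspect which lines were executed.
import trace

def classify(n):
    if n % 2 == 0:
        return "even"
    return "odd"

tracer = trace.Trace(count=True, trace=False)
tracer.runfunc(classify, 3)            # exercises only the "odd" branch
counts = tracer.results().counts       # maps (filename, lineno) -> hit count
```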


   4) Look into how to do test escape analysis (TEA) with Launchpad. TEA
will tell us, after Precise, whether we missed problems that were found
by someone after we did our testing; that should help us understand
whether we need to add new test cases in those missed areas.
Are we tagging, in some way, bugs that 'we' found (I'm assuming 'we' 
here means routine testing such as smoke testing and ISO testing, rather 
than regular use by end-users)?


   5) Gather test cases from defects. This is about making a list of
defects that have been fixed for Oneiric and that have steps to
reproduce the problem, which need to be gathered and written up as
proper test cases.

Does someone already have this list? Release managers for example?


- Test Case Management System
(https://blueprints.launchpad.net/ubuntu/+spec/other-p-qa-test-case-management-tool)
   6) Still not available, but when it is, review and give feedback
about Litmus and its usability. Also help decide how to configure it to
make it more suitable for Ubuntu community testing.

Just let us know when it's ready ;)



- QA Backlog tasks
(https://blueprints.launchpad.net/ubuntu/+spec/other-p-qa-backlog)
   7) Review and change the wiki to reflect the new approach to QA.


Please bear in mind that, since we don't have the test case management
tool up and running yet, we need to keep our test cases in text files or
OpenOffice documents (preferably spreadsheets) for now. As soon as we
have chosen a tool to handle them, we will use it.

I have added a template at the bottom of the test cases page, feel free
to use it for your newly generated test cases:
https://wiki.ubuntu.com/QATeam/TestCase

You can also modify it to contain a link to the old test case whenever
you are improving an existing test case.


Let us know which tasks you are interested in and I will map the
tasks in the blueprints to people, so that we keep track of what
everyone is doing and do not duplicate work. I have numbered the tasks
to make them easier to discuss.

You don't need to take an entire task: if you feel you can only work on
two or three test cases, just say so and we will make sure nobody else
is working on the same ones as you.

Looking forward to your answers!
Gema
Re: QA tasks available

2011-12-06 Thread Gema Gomez
After reading through all the emails, I have put together a little table
showing who is interested in what; so far only task 1 has been started.

https://wiki.ubuntu.com/QATeam/TasksPrecise

Thanks,
Gema


Re: QA tasks available

2011-12-06 Thread J
I'd be interested in helping with the test case cleanup and smoke test
brainstorming (tasks 1 and 2).

-- 
Ubuntu-qa mailing list
Ubuntu-qa@lists.ubuntu.com
Modify settings or unsubscribe at: 
https://lists.ubuntu.com/mailman/listinfo/ubuntu-qa


Re: QA tasks available

2011-12-06 Thread Brian Murray
On Mon, Dec 05, 2011 at 04:24:19PM +, Gema Gomez wrote:
   5) Gather test cases from defects. This is about making a list of
 defects that have been fixed for Oneiric and that have a set of steps to
 reproduce the problem that needs to be gathered and written into a
 proper test case.

Here are some Launchpad searches that should help:

Bugs with an oneiric task that are Fix Released and have been tagged
'testcase'.

https://bugs.launchpad.net/ubuntu/oneiric/+bugs?field.searchtext=&orderby=-importance&field.status%3Alist=FIXRELEASED&assignee_option=any&field.assignee=&field.bug_reporter=&field.bug_supervisor=&field.bug_commenter=&field.subscriber=&field.component-empty-marker=1&field.tag=testcase&field.tags_combinator=ANY&field.status_upstream-empty-marker=1&field.has_cve.used=&field.omit_dupes.used=&field.omit_dupes=on&field.affects_me.used=&field.has_no_package.used=&field.has_patch.used=&field.has_branches.used=&field.has_branches=on&field.has_no_branches.used=&field.has_no_branches=on&field.has_blueprints.used=&field.has_blueprints=on&field.has_no_blueprints.used=&field.has_no_blueprints=on&search=Search

Bugs about Ubuntu that are Fix Released and tagged 'testcase' and
'oneiric'.  This one is timing out and causing an OOPS, although it
would be easy to get a list of the same bugs using the Launchpad API.

https://bugs.launchpad.net/ubuntu/+bugs?field.searchtext=&orderby=-importance&field.status%3Alist=FIXRELEASED&assignee_option=any&field.assignee=&field.bug_reporter=&field.bug_supervisor=&field.bug_commenter=&field.subscriber=&field.component-empty-marker=1&field.tag=oneiric+testcase&field.tags_combinator=ALL&field.status_upstream-empty-marker=1&field.has_cve.used=&field.omit_dupes.used=&field.omit_dupes=on&field.affects_me.used=&field.has_no_package.used=&field.has_patch.used=&field.has_branches.used=&field.has_branches=on&field.has_no_branches.used=&field.has_no_branches=on&field.has_blueprints.used=&field.has_blueprints=on&field.has_no_blueprints.used=&field.has_no_blueprints=on&search=Search

My bug bot automatically tags bugs whose description contains the words
TEST CASE with 'testcase', and it has been doing so for some time now.
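The API route mentioned above could look roughly like this with
launchpadlib (an untested sketch: the consumer name is arbitrary, and the
call needs network access plus the third-party launchpadlib package):

```python
# Sketch: fetch Fix Released Ubuntu bugs tagged 'testcase' and 'oneiric'
# through the Launchpad API instead of the (timing-out) web search.
# Requires the third-party launchpadlib package; network access at call time.

def fetch_testcase_bugs(tags=("testcase", "oneiric")):
    """Return Launchpad bug tasks matching all of the given tags."""
    from launchpadlib.launchpad import Launchpad  # imported lazily
    lp = Launchpad.login_anonymously("qa-testcase-harvest", "production")
    ubuntu = lp.distributions["ubuntu"]
    return ubuntu.searchTasks(status="Fix Released",
                              tags=list(tags),
                              tags_combinator="All")
```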

--
Brian Murray



Re: QA tasks available

2011-12-05 Thread Aaron
I'd be interested in starting on ISO testing and task 1.  I think it would
give me some good initial exposure to the overall picture: installed
applications, testing priorities, and writing some ISO smoke test
cases. Does that sound okay?

Is there a notification for when changed packages are submitted to the
code base, or some other process for knowing what packages have been
added or changed? Have there been any initial discussions on testing
priority? If not, I'm sure I can come up with an initial list and we can
start from there.

I'd be interested in having a clear picture of what has changed between
builds/releases and in helping determine what should be included in the
smoke testing, etc.

Getting a list of the packages isn't hard, so I was thinking about how to
determine testing priority, write the test cases, and so on.

Unless someone is already on it, or you'd like me to look at something
else instead...

Thanks!

Aaron


Re: QA tasks available

2011-12-05 Thread Alex Lourie
Hi Gema, all

I'd be interested in:

1. I'd like to participate in the discussions.
2. I really think this is important. Making clear, concise test cases would
help us tremendously in converting some of them to automated tests and
might also bring in new people.

4. I would be interested in helping here as well.

5. I may be interested, although I possibly lack the knowledge. I'll have
to see how it evolves first.

6,7 - definitely interested in helping here.


-- 
Alex Lourie