Re: Outdated Laptop Test plan makes testing a less pleasant experience

2010-10-24 Thread Sergio Zanchetta
2010/10/15 Daniel Kulesz daniel.kul...@informatik.uni-stuttgart.de

Hello Daniel!

exc-001: There is no example folder / symlink anymore; the content was moved
 to /usr/share/example-content some time ago. It is not clear which files /
 applications are targeted, or what the expected output is (not only should
 the applications start correctly, they should also handle the files
 correctly).


In fact, I don't know how to handle this, because I don't know the current
and future plans for this folder.
Suggestions and comments are welcome.
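
For what it's worth, if we keep the test case, here is a minimal sketch of
what "open every example file with its default application" could look like
(xdg-open simply launches whatever application is registered for each file
type, so the tester still has to judge whether the file is handled
correctly):

#!/usr/bin/env python
# Sketch only: open every file under /usr/share/example-content with its
# default application, so the tester can check it is handled correctly.
import os
import subprocess

EXAMPLES = "/usr/share/example-content"

for root, dirs, files in os.walk(EXAMPLES):
    for name in sorted(files):
        path = os.path.join(root, name)
        print("Opening %s ..." % path)
        # xdg-open returns immediately; the tester inspects the opened window.
        subprocess.call(["xdg-open", path])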


 snr-001: Entering the password is not needed when testing via LiveCD; this
 should be mentioned.


Done.


 snr-002: Executing more than 30 sleep/wakeup cycles could lead to hardware
 damage; limiting the cycles to 5 should really be enough!


This kind of test is present in checkbox as well; I don't have an answer
about it.
Maybe a member of the kernel team could clarify this.
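
Just as a reference for the discussion, a bounded loop could look roughly
like the sketch below; the 5-cycle limit and the 30-second wake timer are
only illustrative values, and rtcwake has to be run as root:

#!/usr/bin/env python
# Sketch only: a bounded suspend/resume loop with illustrative values.
import subprocess
import time

CYCLES = 5

for i in range(CYCLES):
    print("Suspend/resume cycle %d of %d" % (i + 1, CYCLES))
    # Suspend to RAM and program the RTC to wake the machine after 30 s.
    subprocess.check_call(["rtcwake", "-m", "mem", "-s", "30"])
    time.sleep(10)  # let the machine settle before the next cycle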


 him-001: It is not clear whether this applies only to internal card
 readers.


It should be clear because it's in the system section. External card
reader tests are in the USB section (hum-*).


 khk-001: The expected behaviour is that Ubuntu also shows the volume meter
 on screen; but on many older machines the volume is adjusted without the
 user being given appropriate (visual) feedback. This should be included in
 this test case as well, to spot the affected machines.


This test is about checking whether the multimedia hotkeys perform the
appropriate action after being pressed (e.g. lowering the volume).
Some laptop hotkeys are not mapped yet, and that needs to be reported, for
example.
The issue you are describing is a visual one, and even if it doesn't seem to
be a hardware problem it should be reported as a bug as well, specifying it
in the bug description.



 hsi-001: the media file fables_01_01_aesop.spx seems to be outdated; there
 is other media included in 10.10. It should also be verified that the sound
 plays correctly and does not produce any stuttering.
 hhi-001: see hsi-001


Now it should be ok.


 his-001: Changing the refresh rate is not covered
 his-001: (video in general) there is no test case for fluent video playback
 (maybe except exc-001)


We are testing laptops; changing the refresh rate should mainly be important
when using CRT monitors.
As for fluent video playback, the exc-001 test could be sufficient to cover
it. In any case, that should be considered an application test (media
player), not a hardware one.
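
If we ever add a refresh-rate step, it could be as simple as the sketch
below (the output name LVDS1 and the 1024x768 @ 60 Hz mode are only
placeholders; the real values come from "xrandr -q"):

#!/usr/bin/env python
# Sketch only: list the available modes, then switch the refresh rate.
import subprocess

# Show connected outputs and the modes/refresh rates they support.
subprocess.check_call(["xrandr", "-q"])

# Switch the internal panel (placeholder name) to a placeholder mode/rate.
subprocess.check_call(["xrandr", "--output", "LVDS1",
                       "--mode", "1024x768", "--rate", "60"])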


 bfu-001: Not clear what is meant by app - is this the whole operating
 system?


It should be ok now.


 hpu-002: Ubuntu has various bugs regarding wrong paper size (especially A4
 vs. letter) in some drivers; this should be verified here as well.


As mentioned before, this could be considered an application test, not a
hardware one.
Anyway, this shouldn't prevent you or anyone else from reporting and linking
bugs you find while running a test, even if they are not directly related to
it.


 hpu-002, hds-001, hds-002, hds-003, hds-004: missing here - at least some
 of them should be included in LaptopTesting as well; this is pretty
 important for many mobile users


Good point.



 Other, general aspects:
 * the template for the Laptop entries
 (https://wiki.ubuntu.com/Testing/Laptop/Reports/Template) is *very*
 incomplete. Most people seem to include info like lshw output, dmidecode
 details etc. - why is this all missing in the template? Therefore, instead
 of re-using the template, testers have to copy/paste the structure around
 from existing reports.


This is because the original template included just a table; then some
people started adding other info on their own, without mentioning it on the
ML or changing the template.
For Natty I'm going to enrich it with other data to be filled in by users.
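
As a side note, the extra data could even be collected in one go with
something like the sketch below (lshw and dmidecode need root, and the
output file name is only an example):

#!/usr/bin/env python
# Sketch only: collect the hardware details most reports paste by hand.
# Run as root, since lshw and dmidecode read DMI/sysfs data.
import subprocess

commands = [
    ("lshw (short)", ["lshw", "-short"]),
    ("dmidecode (system)", ["dmidecode", "-t", "system"]),
]

with open("laptop-report-hwinfo.txt", "w") as report:  # example file name
    for title, cmd in commands:
        report.write("== %s ==\n" % title)
        report.write(subprocess.check_output(cmd, universal_newlines=True))
        report.write("\n")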


 * the main wiki page (https://wiki.ubuntu.com/Testing/Laptop/Reports) has no
 consistency regarding the naming of the tested editions


What do you mean?
Putting the edition name in that page should be sufficient.



 It would make testing Ubuntu (Pre-)Releases a much more pleasant experience
 if you could update the test plan and look into the mentioned issues. I
 strongly believe there are many more people out there who would be willing
 to contribute to systematic Laptop Testing, but finding the wiki pages and
 test plan in the current (not very usable) state could be pretty
 frustrating.


Thank you very much, Daniel, for your suggestions.
Laptop testing is a community project, so everyone can help and contribute
by updating test cases, suggesting improvements, etc.
In the future, feel free to keep sharing your thoughts and proposing your
ideas here on the ML so we can discuss them together.


Regards,
Sergio.

P.S.: Apart from the issues you mentioned, I can say that the main problem
of this project is the use of wiki pages to report results.
I proposed a blueprint to integrate laptop testing into the ISO tracker, and
it'll be discussed at the Ubuntu Developer Summit for Natty [¹]

P.S.2: Thanks to the people who helped updating 

Re: Outdated Laptop Test plan makes testing a less pleasant experience

2010-10-24 Thread Daniel Kulesz
Hello Sergio,

thanks a lot for your comprehensive answer. I will try to further clarify / 
refine some of the mentioned points:

  exc-001: There is no example folder / symlink anymore; the content was
  moved to /usr/share/example-content some time ago. It is not clear which
  files / applications are targeted, or what the expected output is (not
  only should the applications start correctly, they should also handle the
  files correctly).
 
 In fact, I don't know how to handle this, because I don't know the current
 and future plans for this folder.
 Suggestions and comments are welcome.
As far as I understood it, the main goal of the LaptopTesting effort is to
do a system test for *future* Ubuntu Releases, not for past ones. Therefore,
if there are any system changes which require changes to existing test cases
(like with the examples folder), the changes should be reflected in the test
plan, so that testers can execute the test cases on the future (current
alpha/beta/rc...) pre-releases. Preserving traceability could be achieved
simply by mentioning in the Test Report which version of the Test Plan
(currently, that would be the version of the wiki pages) the tests were
carried out against.

 
  snr-001: Entering the password is not needed when testing via LiveCD;
  this should be mentioned.
 
 Done.
Looks good. Maybe it should be considered to mark the optional test steps in
a more eye-catching way, e.g.:
3b. (only applies in Live Environment) You should be presented with a
password prompt; enter your password

  snr-002: Executing more than 30 sleep/wakeup cycles could lead to
  hardware damage; limiting the cycles to 5 should really be enough!
 
 This kind of test is present in checkbox as well; I don't have an answer
 about it.
 Maybe a member of the kernel team could clarify this.
I would appreciate this very much. Could you try to get an answer from the 
kernel team? (I dunno if they read this list)

  him-001: It is not clear whether this applies only to internal card
  readers.
 
 It should be clear because it's in the system section. External card
 reader tests are in the USB section (hum-*).
Alright, but then some description (just one sentence) of what is meant by
system-level tests would be nice in the test plan, maybe with the remark
that tests for hardware which is not present shall be skipped.

 
  khk-001: The expected behaviour is that Ubuntu also shows the volume
  meter on screen; but on many older machines the volume is adjusted
  without the user being given appropriate (visual) feedback. This should
  be included in this test case as well, to spot the affected machines.
 
 This test is about checking whether the multimedia hotkeys perform the
 appropriate action after being pressed (e.g. lowering the volume).
 Some laptop hotkeys are not mapped yet, and that needs to be reported, for
 example.
 The issue you are describing is a visual one, and even if it doesn't seem
 to be a hardware problem it should be reported as a bug as well, specifying
 it in the bug description.
Well yes, it's a usability problem at first sight. But actually there is
hardware where the driver supports this visual feedback (e.g. on my Thinkpad
X301) and hardware where the driver does not support it. I guess this is
related to ACPI event handling etc. - but since the software already
supports the visual feedback, it's a system bug if the driver does not
notify the software about the action that happened.
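
A quick way to narrow this down on an affected machine could be to check
whether the key event reaches userspace at all, e.g. with a small helper
like the sketch below (assuming acpid is installed, so acpi_listen is
available):

#!/usr/bin/env python
# Sketch only: print ACPI/hotkey events while the volume keys are pressed.
# If nothing shows up here, the driver is probably not forwarding the event.
import subprocess

proc = subprocess.Popen(["acpi_listen"], stdout=subprocess.PIPE,
                        universal_newlines=True)
try:
    for line in proc.stdout:
        print(line.rstrip())
except KeyboardInterrupt:
    pass
finally:
    proc.terminate()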

  hsi-001: the media file fables_01_01_aesop.spx seems to be outdated;
  there is other media included in 10.10. It should also be verified that
  the sound plays correctly and does not produce any stuttering.
  hhi-001: see hsi-001
 
 Now it should be ok.
Great!

  his-001: Changing the refresh rate is not covered
  his-001: (video in general) there is no test case for fluent video
  playback (maybe except exc-001)
 
 We are testing laptops; changing the refresh rate should mainly be
 important when using CRT monitors.
 As for fluent video playback, the exc-001 test could be sufficient to cover
 it. In any case, that should be considered an application test (media
 player), not a hardware one.
This is also a video driver issue, and therefore a system issue. With some
drivers (i.e. Intel GM855) playback only works in a normal window but not in
fullscreen: you either get a distorted picture, or the full available screen
size is not used for the output. So there are laptops where it works fine
(with the same media player version and media file) and laptops where it
fails.
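
During the test this could be exercised by simply playing the same clip
windowed and then fullscreen, e.g. with something like the sketch below
(the clip path is only a placeholder, and I'm assuming Totem here):

#!/usr/bin/env python
# Sketch only: play the same clip windowed and then fullscreen to compare.
import subprocess

CLIP = "/path/to/sample-video.ogv"  # placeholder path

subprocess.call(["totem", CLIP])                  # windowed playback
subprocess.call(["totem", "--fullscreen", CLIP])  # fullscreen playback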

  bfu-001: Not clear what is meant by app - is this the whole operating
  system?
 
 It should be ok now.
Perfect!

 
  hpu-002: Ubuntu has various bugs regarding wrong paper size (especially
  A4 vs. letter) in some drivers; this should be verified here as well.
 
 As mentioned before, this could be considered an application test, not a
 hardware one.
 Anyway, this shouldn't prevent you or anyone else from reporting and
 linking bugs you find