Jasha Joachimsthal
Europe - Amsterdam - Oosteinde 11, 1017 WT Amsterdam - +31(0)20 522 4466
US - Boston - 1 Broadway, Cambridge, MA 02142 - +1 877 414 4776 (toll free)
www.onehippo.com
On 12 April 2012 03:59, Franklin, Matthew B. <[email protected]> wrote:
-----Original Message-----
From: Marlon Pierce [mailto:[email protected]]
Sent: Tuesday, April 10, 2012 12:01 PM
To: [email protected]
Subject: Re: Quality assurance steps for Rave
I started a checklist of tests at
http://wiki.apache.org/rave/ReleaseManagement/QualityAssurance, as you
have seen from the [Rave Wiki] messages.
Good set of tests. I had been going through most of these on my own
before each release, but that will become impossible to keep up with as the
functionality grows.
* Assuming the user interface tests are done manually, updating the wiki
every time would be cruddy. Is there good test/assurance process
management software out there? Should we do this with an online (Google)
spreadsheet?
I say create a blocking pre-release test task in Jira and have subtasks
for each "section" of the tests (we can use Jira's copy feature so it
isn't such a manual task).
* Do we want instead to actively maintain Selenium tests? I started this
but let it drop because the tests only worked in Firefox, and Firefox's
frequent updates broke them. The tests themselves would have to be
actively maintained: if you change or add a user interface feature, you
also need to update the test.
I think if we develop the simplest possible tests to protect against
regression, it could work without too much effort. We need to make it an
explicit part of the process, though. It has also been my experience that
recorded test cases don't work without some modification. I have also had
decent success in keeping code-based (not Selenese) tests working.
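To illustrate what such a code-based check could look like, here is a
minimal JUnit/WebDriver sketch; the class name, URL and element id are
illustrative assumptions, not existing Rave test code:

import static org.junit.Assert.assertEquals;

import org.junit.After;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class LoginPageRegressionTest {

    // Firefox is used here only because it matches the discussion above;
    // any WebDriver implementation would do.
    private final WebDriver driver = new FirefoxDriver();

    @Test
    public void loginPageShowsUsernameField() {
        // URL and element id are assumptions for illustration only
        driver.get("http://localhost:8080/portal");
        assertEquals(1, driver.findElements(By.id("usernameField")).size());
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}

Because a test like this asserts against a single stable element rather
than replaying a recorded click sequence, it tends to need less upkeep
than recorded Selenese when the browser or page layout changes.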
* Are there other recommendations besides Selenium?
I've just created a test project [1] with JBehave [2], which uses Selenium
under the hood.
Scenario: User creates a new account and logs into the portal
When I go to "http://localhost:8080/portal"
Then I see the login page
When I follow the new account link
Then I get the new account form
When I fill in the form with username "newuser" password "password" confirmpassword "password" email "[email protected]"
And I submit the new account form
Then I see the login page
And a message appears "Account successfully created"
When I fill in the login form with username "newuser" password "password"
Then I see my portal page with the add new widgets box
You can run it with mvn clean install, or run the NewUserStories class
from your IDE.
It does not clean up the newly created user, so you cannot run it twice
successfully without removing that user first.
[1] https://github.com/jashaj/PortalTests
[2] http://jbehave.org/
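For readers unfamiliar with JBehave, the steps in such a scenario map onto
annotated Java methods. A minimal sketch follows; the class name and
element locators are assumptions, not copied from the PortalTests project:

import static org.junit.Assert.assertFalse;

import org.jbehave.core.annotations.Then;
import org.jbehave.core.annotations.When;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

public class NewUserSteps {

    private final WebDriver driver;

    public NewUserSteps(WebDriver driver) {
        this.driver = driver;
    }

    @When("I go to \"$url\"")
    public void goTo(String url) {
        // JBehave binds the quoted value from the scenario step to $url
        driver.get(url);
    }

    @Then("I see the login page")
    public void assertLoginPage() {
        // assumes the login form can be located by an id; adjust to the real markup
        assertFalse(driver.findElements(By.id("loginForm")).isEmpty());
    }

    @When("I follow the new account link")
    public void followNewAccountLink() {
        // link text is an assumption, not taken from the actual page
        driver.findElement(By.linkText("New account")).click();
    }
}

JBehave matches the annotation text against the scenario steps and injects
the quoted values as method parameters; the steps class then presumably
gets registered in the stories runner (NewUserStories in this project) so
it is picked up by mvn clean install.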
Marlon
On 4/6/12 3:32 PM, Raminderjeet Singh wrote:
+1 with Marlon on having a checklist of features. I know going forward
this list can grow and we may need more people to verify the build.
Based on my experience doing the 0.10 release, Matt and others have done a
great job putting all the steps together into scripts. Thanks!
After the code freeze announcement, the Release Manager can tag the
current code and verification can be done on that particular tag. One
developer (possibly the Release Manager) can verify the code tag against
the feature list and, if everything looks good, do the release based on
the tag. Matt can comment more, as I still have to understand when the pom
versions are updated.
Thanks
Raminder
On Apr 6, 2012, at 2:53 PM, Marlon Pierce wrote:
I'd like to propose the following.
* Develop a list of Rave features that should be tested, put this on the
wiki, and update it every time a new feature is added. I'm happy to get
this started. Recommendations for tools to automate some of this are
welcome.
* Regular testing of the above feature list (not just in preparation for
releases). We would need a record of this (who tested, when, etc.). If we
do this manually, we could keep records by updating the wiki, posting to
the dev list, etc. Recommendations for better tools are of course welcome.
* All features should be tested before release by at least one person.
The Release Manager should coordinate this. We could continue to do this
as part of the current Release Candidate process, but it may be better to
have an intermediate SVN tag that can go through QA before we start an
official vote. This would allow us to keep the trunk open for commits
while we test, and avoid canceled releases.
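For example, such an intermediate tag could be created with a plain svn
copy (the paths and tag name below are only illustrative, assuming the
usual ASF repository layout):

svn copy https://svn.apache.org/repos/asf/rave/trunk \
    https://svn.apache.org/repos/asf/rave/tags/<version>-QA \
    -m "Tag <version> for pre-release QA"

The QA tag would then go through the feature checklist, and only once it
passes would the formal release candidate and vote follow.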
I'm not a QA expert, so comments welcome.
Marlon