Hi Sara,
Here are some answers to your questions.
Hope this helps,
-Rick
On 2/13/12 1:14 PM, Sara Nilsson wrote:
To gatekeepers, testers, or to whom it may concern.
We're four engineering students in the Department of Computer Science at
the Faculty of Engineering, University of Lund (LU, http://www.lu.se ;
LTH, http://www.lth.se). We're working on a paper on test methodology in
open source projects and would be very grateful if you could spare a few
minutes to answer the following questions.
We intend to send this questionnaire to a number of open source projects
and compare their views on testing the projects they're currently
developing.
When the paper is finished, we're more than willing to share our findings
with you.
Sincerely,
Alexander Wallin, Ola Berg, Oscar Hjerpe and Sara Nilsson
Engineering students
Faculty of Engineering
University of Lund
Questions
If you lack an answer to any question, please write <skip> or leave the
answer area empty.
(General info about the project)
- How large in lines of code is the project in its current form?
(approximately) (KLOC)
Derby has two repositories, one for the product source code and one for
the documentation. Here are the results of running cloc
(http://cloc.sourceforge.net/) on these repositories:
The code repository results:
--------------------------------------------------------------------------------
Language                     files          blank        comment           code
--------------------------------------------------------------------------------
Java                          2908         166674         373647         634000
SQL                            307          12790          17618          42514
XML                            130           4420           3772          21223
HTML                            45            967            964           8340
Bourne Again Shell             52            651           1364           3131
CSS                             1            124             23            709
XSLT                            4             45             71            421
DOS Batch                      10             99              7            291
Bourne Shell                   14             91            250            253
Javascript                      1             75             94            246
sed                             2             17             59             96
XSD                             1             16             14             67
DTD                             2             20             43             29
--------------------------------------------------------------------------------
SUM:                          3477         185989         397926         711320
--------------------------------------------------------------------------------
The documentation repository results. Note that this run missed the
actual documentation source, which is maintained in DITA; cloc did not
analyze the DITA files:
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
XSLT                           11            774            992           5416
DTD                            12            271            517            498
XML                             3             93             65            453
HTML                            2              4             28             19
-------------------------------------------------------------------------------
SUM:                           28           1142           1602           6386
-------------------------------------------------------------------------------
- How many developers are currently actively developing? (approximately)
~15
- For how long has this project been actively developed? (approximately)
The project was open-sourced in August 2004, so it has been actively
developed as an open source project for 7 1/2 years. Before that, it was
a closed-source product called Cloudscape. Development on Cloudscape
began in 1996, so the code itself has been under development for 16
years.
- Are there any current industry or university backers to the project?
(Monetary backing and/or developers contributed to the project)
Two large companies contribute teams to work on Derby. Those companies
are Oracle and IBM.
(Test related questions)
- Are there any documents concerning testing, such as a test plan, test
strategy, or equivalent documents? Are they actually followed?
Documentation on the Derby tests can be found here:
http://svn.apache.org/viewvc/db/derby/code/trunk/java/testing/README.htm?view=co
Yes, they are followed.
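For a flavor of what that README describes, here is a minimal sketch of a
Derby-style JUnit test. The harness class and helper names below
(BaseJDBCTestCase, TestConfiguration.defaultSuite, createStatement) are
from memory and should be treated as assumptions; the README is the
authoritative reference:

    // Sketch of a Derby-style JUnit test, assuming the
    // org.apache.derbyTesting.junit harness described in the README.
    import java.sql.ResultSet;
    import java.sql.Statement;
    import junit.framework.Test;
    import org.apache.derbyTesting.junit.BaseJDBCTestCase;
    import org.apache.derbyTesting.junit.TestConfiguration;

    public class SimpleQueryTest extends BaseJDBCTestCase {
        public SimpleQueryTest(String name) {
            super(name);
        }

        // defaultSuite() wraps the test in the standard embedded and
        // client/server configurations (an assumption; see the README).
        public static Test suite() {
            return TestConfiguration.defaultSuite(SimpleQueryTest.class);
        }

        public void testInsertAndSelect() throws Exception {
            // createStatement() is a harness helper that hands out a
            // statement against the test database.
            Statement s = createStatement();
            s.executeUpdate("CREATE TABLE t(a INT)");
            s.executeUpdate("INSERT INTO t VALUES (1)");
            ResultSet rs = s.executeQuery("SELECT a FROM t");
            assertTrue(rs.next());
            assertEquals(1, rs.getInt(1));
            rs.close();
        }
    }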
- Who conducts formal tests? Are there any dedicated testers? In what
form?
Tests are run by the active contributors. Some active contributors are
quality engineers.
- When do you decide to release a new version? Is it feature or date
based?
(e.g. do you release on set dates or when new functionality is added)
Releases are built when a committer volunteers to do the leg work. The
Derby community does not promise to produce releases on specified dates.
However, over the last 7 years, the following release cadence has recurred:
o A feature release in the spring.
o A maintenance release in the fall.
- What kind of testing is conducted before a release? Is it any different
from when it's developed?
As a check-in barrier, committers run an extensive regression test
suite. Weekly and monthly stress tests have been run as well.
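The regression suite itself is a JUnit suite. As a sketch, it can be
driven programmatically like this, assuming the suites.All class follows
the standard JUnit 3 suite() convention (the README above documents the
real invocation and required classpath):

    // Sketch: run Derby's full regression suite from Java. Assumes
    // derby.jar, derbyTesting.jar, derbytools.jar, and junit.jar are on
    // the classpath, and that suites.All exposes a JUnit 3 suite().
    import junit.framework.Test;
    import junit.textui.TestRunner;

    public class RunRegressionSuite {
        public static void main(String[] args) throws Exception {
            Test all = org.apache.derbyTesting.functionTests.suites.All.suite();
            TestRunner.run(all);
        }
    }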
- Who discovers the most bugs before a release? Testers? Beta testers?
Users?
Before a release, bugs are found mostly by contributors. Users are
asked to test-drive patches and new features.
- Who discovers the most bugs after a release? Testers? Beta testers?
Users?
Bugs are logged by contributors and users.
- Is the majority of bugs discovered before or after a release?
We don't keep metrics on this.
- Test coverage. Is the concept of test coverage important? Do you have
any statistics concerning this?
Contributors run the EMMA code coverage tool on an ad hoc basis. The
last release for which I have published results was 10.4.1.3:
OVERALL COVERAGE SUMMARY
--------------------------------------------------
all classes: 87% (1226/1404)
method:      74% (16629/22556)
block:       69% (486715/704201)
line:        69% (101022.4/146764)
OVERALL STATS SUMMARY
--------------------------------------------------
total packages:         90
total executable files: 1286
total classes:          1404
total methods:          22556
total executable lines: 146764
A recent (unpublished) EMMA run at Oracle produced these numbers:
all classes: 93% (1432/1546)
method:      73% (17691/24115)
block:       70% (519340/745738)
line:        69% (107361.4/156389)
- What tools are used during testing?
Ant and JUnit. In addition, Oracle and IBM drive platform tests using
proprietary, in-house test frameworks.
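To make the JUnit side concrete, here is a self-contained sketch of a
plain JUnit 4 test against embedded Derby, using the in-memory back end
so nothing is left on disk. This is illustrative only, not one of Derby's
own tests:

    // Illustrative smoke test: plain JDBC against embedded Derby.
    // The memory: subprotocol keeps the database entirely off disk.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    public class EmbeddedDerbySmokeTest {
        @Test
        public void insertAndSelect() throws Exception {
            // On Java 6+ the embedded driver is loaded automatically
            // from derby.jar via the JDBC service loader.
            Connection conn = DriverManager.getConnection(
                    "jdbc:derby:memory:smokedb;create=true");
            Statement s = conn.createStatement();
            s.executeUpdate(
                    "CREATE TABLE t(id INT PRIMARY KEY, name VARCHAR(20))");
            s.executeUpdate("INSERT INTO t VALUES (1, 'one')");
            ResultSet rs = s.executeQuery("SELECT name FROM t WHERE id = 1");
            assertTrue(rs.next());
            assertEquals("one", rs.getString(1));
            rs.close();
            conn.close();
        }
    }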
- Is there any difference between testing during development and during a
release? Are developers required to test using the same tests as is used
during a release?
During release testing:
o The regression test suite is run on a broader range of platforms than
is exercised during development.
o Additional stress tests are run, including large-data and long-running
tests.
o Ad hoc integration testing of new features is performed by contributors.