While Ruven's story is absolutely relevant, I think the comparison is being made between apples and oranges.
The first example, which did not concern itself much with documentation or test cases, was not a commercial product. It had only to satisfy the community of amateurs who were writing it, and the community of users that had adopted it. (I use the word "amateur" here to indicate that they are not being paid for the project in question, not to indicate their level of skill in programming.) The second project, described below by Ruven, was commercial.

The differing requirements of an open source project and a commercial one, even an in-house one, are stark. Open source projects have loose deadlines, at best. The main requirement for success is to satisfy the creator(s). A secondary requirement may be to satisfy the community that wishes to use it. A commercial project will always have a deadline, and failure to meet that deadline will always have consequences. Further, there is money on the line. In a commercial project, it is pointless to add features that do not have a clear financial benefit. In an open source project, where all the work is done by unpaid volunteers, it makes perfect sense to add any feature the volunteers want.

It should not be surprising that projects with such distinctly different goals and resources can arrive at those goals by distinctly different methods. (Volunteer vs. paid labor. Free product vs. product for sale. Volunteer-programmer driven vs. corporate/market driven.)

----- Original Message ----
From: Ruven E Brooks <[EMAIL PROTECTED]>
To: PPIG Discuss <discuss@ppig.org>
Sent: Monday, October 1, 2007 8:03:58 AM
Subject: PPIG discuss: A different story about how some software got written

The following is a true story. I haven't even bothered to change any of the details.

1. The developers were really enthusiastic about .NET technology, but they couldn't persuade management to re-write any of the existing products using it.

2. The developers went off and put together a general, all-purpose platform with data management capabilities.
It was so general that it could have housed almost any kind of product, including ones far outside the range offered by the company. It used every .NET feature, including remote databases. (The developers may have thought that they were working for Microsoft.)

3. They showed it to management. Management wanted to know what it was good for. The developers said, it's a product development platform with data management capabilities. It uses all of the latest .NET features.

4. Management asked, could it help customers manage their data? Sure, said the developers. It manages any kind of data.

5. The new product got a catalog number and a product manager to manage and promote it.

6. After five years in development and three years on the market, the product was canceled. No problem with existing customers being upset about the cancellation; there weren't any!

Moral of the story: the kind of programmer-directed development that Frank Wales references only works if the developers are developing something whose usage they understand well, for example, software the developers will be using every day themselves. The usage knowledge requirement also extends to the adoption mechanism, e.g., who will distribute/promote/sell it?

For the remaining 99% of the time, a good set of requirements, not specifications, really helps. The requirements needn't be detailed, but they must give the developers a really accurate idea of the problem they are solving.

Tomorrow, I'll tell a story about agile development in which they didn't bother with documentation, because the code is what is really important, and they didn't bother with test cases until the very end. I'll also tell the true cost of the project, including the cost of "agile recovery."

Ruven Brooks