DataFinder is a new-paradigm product produced at a small division of a large software company. Current sales for the division are roughly $10 million. The product is sold as a substantial performance enhancement for an RDBMS. We will review the actions and measurements taken as the company's assessment of the product's readiness to ship evolved.
The definition of system test used for this discussion is the period when the product developers have completed all feature development and most of the code development. There was a "code freeze" milestone in the schedule. Developer activities were limited to bug fixes and to reviews of those fixes before they were introduced into the system. The SQA engineers were in high gear: running their tests, finding and reporting bugs, and updating tests to account for new bugs or changed behavior.
At the time the case study started, DataFinder management believed they had met code freeze, were in the system test period, and were ready for beta test. In fact, they had already shipped a beta version of the product. However, the customers complained long and loud about the product defects and performance. The Product Development team worked very hard over the next four weeks, addressing the performance and defect issues. DataFinder shipped another beta release. The customers still complained about the product shortcomings.
DataFinder management decided they could not afford a never-ending four-week release cycle with no real way to know whether customers would be satisfied with the software's performance and defect levels. They chose to reassess the current state of the product, measuring bug open and close rates, the percentage of tests passing, and test coverage.
Figure 1: Current bug rate
DataFinder engineering personnel were finding bugs faster than developers could fix them, a common problem early in the system test phase (see Figure 1).
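The effect of a find rate outrunning the fix rate is easy to see with a few numbers. The weekly counts below are purely illustrative (they are not data from the case study), but they show how the open-bug backlog keeps climbing until the fix rate catches up with the find rate:

```python
# Hypothetical weekly find/fix counts; the numbers are illustrative,
# not taken from the DataFinder case study.
found_per_week = [40, 35, 30, 28, 25, 20]
fixed_per_week = [15, 18, 20, 22, 25, 28]

open_bugs = 0
backlog = []
for found, fixed in zip(found_per_week, fixed_per_week):
    open_bugs += found - fixed  # backlog grows while find rate > fix rate
    backlog.append(open_bugs)

print(backlog)  # -> [25, 42, 52, 58, 58, 50]
```

The backlog peaks only in the week the two rates cross, which is why a chart like Figure 1 is so useful: it shows at a glance whether that crossover has happened yet.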
Figure 2: Test pass percentage
As shown in Figure 2, they also found that the number of planned system tests kept growing, because more features were still being added to the software. These features were logged as bugs, but were being worked on as features.
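A growing planned-test baseline distorts the pass percentage: the metric can fall even while the absolute number of passing tests rises. The weekly figures below are made up for illustration (they are not from the case study):

```python
# Illustrative (planned, passing) test counts over three weeks.
# Passing tests increase, yet the pass percentage falls because the
# planned baseline keeps growing as features are added.
weeks = [
    (800, 500),
    (900, 540),
    (1000, 560),
]

pass_rates = []
for planned, passing in weeks:
    pct = 100.0 * passing / planned
    pass_rates.append(pct)
    print(f"{passing}/{planned} passing = {pct:.1f}%")
# -> 62.5%, 60.0%, 56.0%
```

This is one reason an unstable system test baseline made it so hard for DataFinder management to judge progress.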
Raw test coverage data suggested that the current suite of 1000 regression tests covered about 30% of the functions. However, approximately half the functions were believed to be extraneous (dead code). Under that assumption, management put the effective function coverage at roughly 0.30 / 0.50 = 60%.
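The coverage adjustment above is simple division, sketched here with the figures from the case study (the 30% raw coverage and the 50% live-code assumption are the source's; the variable names are mine):

```python
# Back-of-the-envelope coverage adjustment:
# 1000 regression tests exercised ~30% of all functions, and roughly
# half the functions were assumed extraneous (dead code).
raw_coverage = 0.30    # fraction of all functions exercised by the suite
live_fraction = 0.50   # fraction of functions assumed non-extraneous

adjusted_coverage = raw_coverage / live_fraction
print(f"{adjusted_coverage:.0%}")  # -> 60%
```

Note that the adjusted figure is only as good as the dead-code assumption; if fewer functions were truly extraneous, the effective coverage would be lower.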
DataFinder management had a number of concerns, once they saw these metrics: when were they going to stop finding bugs faster than they could fix them, and when could they stabilize the system test baseline? Based on this data, management decided the feature freeze milestone had not yet been met.
In fact, this data gathering and analysis clarified their thinking about product quality. Initially, the SQA engineers thought a low defect count was most important. Software developers thought meeting the schedule was most important. After reviewing the data, management decided that features and performance were most important to the product's success.
DataFinder management decided to treat this as an opportunity, not a disaster. They selected system test entry criteria so that they could achieve defined-length system and beta tests. They would no longer have to work in four-week chunks to determine whether they had a shippable product.
There are critical times to define and review measurements for product shipment:
- During product definition (the requirements and design phase)
- At the beginning of final test (system test)
- At the entry to beta test
- At the end of final test, in preparation for shipment
DataFinder had missed the opportunity to define metrics for the product definition phase, but they were able to set system test entry and exit criteria, beta entry criteria, and shipment criteria.
Original article can be found at: http://www.jrothman.com/Papers/QW96.html
Johanna Rothman consults, speaks, and writes on managing high-technology product development. Johanna is the author of Manage It! Your Guide to Modern Pragmatic Project Management. She is the coauthor of the pragmatic Behind Closed Doors: Secrets of Great Management, and author of the highly acclaimed Hiring the Best Knowledge Workers, Techies & Nerds: The Secrets and Science of Hiring Technical People. Johanna is also a host and session leader at the Amplifying Your Effectiveness (AYE) conference (http://www.ayeconference.com). You can see Johanna's other writings at http://www.jrothman.com.