Quality Assurance Activities in Object-Oriented Software Development


Kunihiko Ikeda, Tetsuto Nishiyama, Kazuyuki Shima, Ken-ichi Matsumoto, Katsuro Inoue, Koji Torii

Abstract

At OMRON Corporation, we executed quality assurance activities in object-oriented software development and verified their effectiveness. We applied the formal description of use-cases and a design review process, in a closely related manner, to the upper stream of our object-oriented software development process. The project team consisted of both software developers and corporate Software Engineering Process Group (SEPG) members. This project structure helped us execute effective quality assurance activities. The defect detection rate of project A, to which the quality assurance activities were applied, was two times higher than that of project B, to which they were not applied. Furthermore, we verified that the number of defects per function point detected in the downstream of project A was less than that of project B.

1. Introduction

At OMRON Corporation, corporate Software Engineering Process Group (SEPG) [5] members have carried out Software Process Improvement (SPI) activities since 1993 to improve the productivity and quality of software development. The percentage of software products using object-oriented technology within OMRON is increasing year by year. Software quality assurance technologies, as well as notations like the Unified Modeling Language (UML) [1] and CASE tools, should be applied to object-oriented software development. However, in spite of lively discussions about object-oriented software development, there are few case-study reports on the effectiveness of object-oriented technology in improving software quality [4]. In this paper, we report the quality assurance activities carried out by corporate SEPG members and their effectiveness in object-oriented software development.
2. Object-Oriented Software Development at OMRON

At OMRON, we first adopted object-oriented technology in 1995 and have been working toward component-based software development.

2.1. Object-oriented software development process

The object-oriented software development process at OMRON is iterative. Each iteration consists of the following phases: analysis, design, programming, unit test, functional test, and system test (Figure 1). In the analysis phase, use-case [6] descriptions were written in itemised or free format. An analysis review was executed once at the end of the analysis phase by a walkthrough of the use-cases. In the design phase, the design model was developed in UML using object-oriented analysis and design tools, such as Rational Rose from Rational Software Corporation, and the design review was executed once at the end of the design phase. Defects detected during the design review were recorded in review reports.

Figure 1: Object-oriented development process at OMRON (iterations 1 to N, each consisting of analysis and design in the upstream, followed by programming, unit test, functional test, and system test in the downstream)

In the programming and unit test phases, we used Microsoft Visual C++. In the functional and system test phases, test cases for graphical user interfaces were executed automatically using Rational SQA-Suite. At the end of each test phase, review meetings were held to judge the software quality of that phase.

2.2. Quality problem

In past object-oriented software projects at OMRON, there were no systematic quality assurance activities in the upper stream of our software development process. Therefore, many defects introduced in the upstream were not detected until the downstream of the software development process. The reason was that the quality performance of the analysis and design reviews depended greatly on the skills of the reviewers. We measured several object-oriented software projects at OMRON and found that if a defect is introduced into the software in the upstream, the effort to remove it in the downstream is up to sixteen times larger than the effort to remove the same defect in the upstream. We therefore expected that improving the upstream defect detection rate through quality assurance activities could dramatically decrease the total effort to remove the defects detected over the software lifecycle.

3. Quality Assurance Activities in the Upstream

We applied the formal description of use-cases and a design review process, in a closely related manner, to improve the defect detection rate in the upstream. SEPG members defined and deployed the new quality assurance activity processes and monitored their execution.

3.1. Organisational structure

The software development organisation at OMRON consists of several software development divisions and the Information Technology Research Centre (ITRC) (Figure 2).

The ITRC was established to improve the productivity and quality of software development at OMRON. It consists of corporate SEPG members whose role is to improve the software process at OMRON. SEPG members research new methodologies and CASE tools, define processes to use them, and train developers. Development divisions at OMRON are divided according to product domains. In this case, SEPG members defined the requirements analysis process using use-cases and a review process, and planned and executed a training program for the software developers. Furthermore, they participated in the review meetings and monitored the execution of the process. The development division took charge of software development and project management. The SEPG members helped to avoid the confusion of applying unfamiliar technologies and processes to the project. SEPG members had the ability to define and modify the software process, and to use object-oriented technologies and quality assurance technologies.

Figure 2: Organisational structure (the Information Technology Research Centre, staffed by SEPG members, researches new technologies such as methodologies and CASE tools adopted by OMRON Corporation, and applies them, tailored for each software development division, to the software development divisions, staffed by software developers)

3.2. Process of introducing new technology

SEPG members researched the quality assurance activities of object-oriented development. They customised generic methodologies and related them to current practices within the development divisions at OMRON. This helped the software developers to understand the difference between the past process and the new one, and smoothed the transition to the new process. SEPG members monitored the process execution and instructed the software developers.

3.3. Training

For successful deployment, SEPG members executed quality assurance training for object-oriented development. The trainees were all developers of the project.
The training program was a half-day course covering use-case descriptions and review points, with discussions and exercises on describing use-cases.

3.4. Formal description of use-cases

In past object-oriented developments at OMRON, requirements specifications were described in free format using natural language (mostly Japanese). Due to variations in documentation styles among the developers, the use of natural language often made it difficult to detect defects. In order to decrease the ambiguity of the requirements specifications and make it easier to find defects, we described the requirements specifications using a formal documentation template for each use-case (Figure 3). First, the system boundaries are determined from the requirements, and a use-case diagram is created to define the services (use-cases) to be offered by the system. Next, for each use-case defined in the system use-case diagram, the inner state transitions of the system, the primary and secondary sequences (scenarios), and the objects (actors) outside the system are defined. In primary scenarios, the primary sequences of the system are defined. In secondary scenarios, error and exception handling scenarios are defined. If necessary, scenarios can be divided into sub-scenarios. The separation of primary and secondary scenarios makes requirements analysts think through the functions of the system systematically. To help visualise the scenarios, complicated sets of scenarios are described with sequence diagrams, and user interface scenarios are described with screen transition diagrams.

Figure 3: Formal description of use-cases (a use-case diagram relating actors and use-cases, and, for each use-case, a brief description, pre-conditions, post-conditions, primary scenarios with sub-scenarios, and secondary scenarios with sub-scenarios)

3.5. Design review process

In past object-oriented design reviews, the review quality was unstable because either the review points were not suited to object-oriented development or the review points were not defined at all. We defined an object-oriented design review process suited to the object-oriented development process at OMRON.
We designed review checklists whose items are explicitly related to items in the specification documents, and used them during review sessions in the upstream (Figure 4). Requirements specifications in the analysis phase consist of two parts. We designed a requirements specifications review checklist covering both the sections containing use-cases and the other sections of the documents pertaining to other aspects of the system requirements. For example, Figure 5 is an excerpt from the use-case portion of the checklist. Each item in a use-case is explicitly related to items in the checklist. The same was done for the software design specifications in the design phase: we designed a software design specifications review checklist covering both the sections containing UML and the other sections of the documents pertaining to other aspects of the software design.

Figure 4: Object-oriented analysis and design review (the reviewer uses the analysis checklist, covering management requirements, technical requirements, use-cases, the development plan, etc., to review the requirements specifications, and the design checklist, covering the software structure, interface with hardware, class specifications, performance targets, etc., to review the software design specifications; each checklist section corresponds to a section of the specification it reviews)

Use-case diagram
1) Are all analysed actors explicitly defined in the diagram?
2) Are all analysed use-cases explicitly defined in the diagram?
3) Are outside systems and devices that communicate with the system defined as actors?
4) Have all actors involved in informing the system of sudden outside changes been considered?
5) Have all use-cases that generate, save, modify, delete, or acquire information from outside systems been considered?
6) Have all actors that support or maintain the system been considered?
7) Have all use-cases that support or maintain the system been considered?

Relationship between use-cases

Brief description of use-case

Actor

Pre/Post-conditions

Scenarios
1) Is the number of primary scenarios in each use-case 5 to 10?
2) Is the number of secondary scenarios in each use-case 5 to 10?
3) Are all scenarios described in adequate detail?
4) Does the result of a walkthrough of each scenario accord with the post-conditions?
5) If necessary, are sequence diagrams included?
6) If scenarios describe user interfaces, are screen transition diagrams included?
7) Is the difference between primary and secondary scenarios considered?

Figure 5: Excerpt from the use-case portion of the analysis review checklist
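Some checklist items, such as the scenario-count bounds in Figure 5, are mechanical enough to automate. The sketch below is our illustration, not tooling from the paper; the `UseCase` class and its field names are hypothetical stand-ins for the formal use-case template of Section 3.4.

```python
from dataclasses import dataclass, field

@dataclass
class UseCase:
    """A use-case per the formal template; field names are illustrative."""
    name: str
    primary_scenarios: list = field(default_factory=list)
    secondary_scenarios: list = field(default_factory=list)

def checklist_findings(uc):
    """Apply the automatable scenario-count items of the checklist
    ('is the number of primary/secondary scenarios 5 to 10?')."""
    findings = []
    n_primary = len(uc.primary_scenarios)
    n_secondary = len(uc.secondary_scenarios)
    if not 5 <= n_primary <= 10:
        findings.append(f"{uc.name}: expected 5-10 primary scenarios, found {n_primary}")
    if not 5 <= n_secondary <= 10:
        findings.append(f"{uc.name}: expected 5-10 secondary scenarios, found {n_secondary}")
    return findings
```

A reviewer would still judge the qualitative items (adequate detail, walkthrough results against post-conditions) by hand; this kind of check only flags use-cases whose scenario counts fall outside the recommended range.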

3.6. Measurement system

We used review reports in the upstream of software development and a defect management tool in the downstream. In the upstream, the problem, priority, cause, and solution of each detected defect were recorded in review reports. SEPG members analysed the defect data and, through interviews with developers, determined the phases in which the defects were introduced. In the downstream, the problem, priority, cause, solution, type, and introduction phase of each detected defect were recorded by the test team or debug team using a defect management tool. Analysing defect data helps to evaluate the effectiveness of the quality assurance activities. In addition to defect data analysis, we find that by doing defect causal analysis we can find problems related to the software process, and use this knowledge to refine the process [2].

4. Result

In order to discuss the effectiveness of the quality assurance activities, we compare two object-oriented software projects, A and B.

4.1. Projects to be compared

Projects A and B were both object-oriented software development projects. The quality assurance activities were applied to project A but not to project B. Both projects shared the same factors: product domain, development division, development team structure, programming language, software size, and CASE tools. Apart from the quality assurance activities themselves, the only major difference between the two projects was the additional use of a defect management tool in project A.

4.2. Metrics

We use the defect detection rate as the metric to measure the effectiveness of quality assurance activities. The defect detection rate of a phase is defined as follows:

defect detection rate = (number of defects detected during the phase) / ((number of defects introduced during the phase) + (number of defects not detected in preceding phases))

4.3. Discussion

Table 1 profiles the defects introduced and detected in each development phase of project A. Table 2 shows the defect detection rate of each development phase of project A.
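The defect detection rate defined in Section 4.2 can be sketched as a small function; the numbers in the usage line are invented for illustration and are not the paper's per-phase counts.

```python
def defect_detection_rate(detected, introduced_this_phase, escaped_from_earlier):
    """Defect detection rate of a phase: defects detected in the phase,
    divided by all defects present in it (those introduced there plus
    those that escaped detection in preceding phases)."""
    present = introduced_this_phase + escaped_from_earlier
    if present == 0:
        return 0.0  # no defects were present, so nothing to detect
    return detected / present

# Illustrative only: a phase that introduces 40 defects, inherits 10
# undetected ones, and detects 45 of those 50 has a rate of 0.9 (90%).
rate = defect_detection_rate(45, 40, 10)  # 0.9
```

Note that the denominator counts defects carried over from earlier phases, so a low upstream detection rate depresses the apparent quality of every later phase as well.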
The defect detection rate of the upstream (analysis and design) of project A was 89%, which was two times higher than that of project B (Table 3).

Table 1: Defect profile of project A

[Table 1 is a matrix of the number of defects, with the phase in which defects were introduced (analysis, design, programming, unit test, functional test, system test) as rows and the phase in which they were detected as columns, plus row and column totals; the individual cell values are not recoverable from this transcription.]

Table 2: Defect detection rate of project A

Development phase      Defect detection rate (%)
Analysis and Design    89
Unit test              80
Functional test        52

Table 3: Comparison of defect detection rates in the upstream

Project   Quality assurance activities   Defect detection rate in the upstream (%)
A         Applied                        89
B         Not applied                    43

Table 4: Defect removal effort of project A

Development phase      Defect removal effort per defect (person-hours)
Analysis and Design    0.25
Unit test              1.0
Functional test        3.5
System test            4.0

Table 5: Comparison of defect removal effort

Project   Number of detected defects (analysis and design / unit test / functional test / system test)   Total defect removal effort (person-hours (%))
A         (values not recoverable from this transcription)                                               (100%)
C         (values not recoverable from this transcription)                                               (141%)

Table 6: Comparison of the number of defects per function point

Project   Quality assurance activities   Number of detected defects per FP
A         Applied                        0.41
B         Not applied                    0.45

For comparison, we calculate how much the effort for removing defects would increase if we had not applied quality assurance activities to the upstream of project A. Suppose that a project C exists such that the defect detection rate in the upstream of project C is the same as that of project B, the defect detection rate in the downstream of project C is the same as that of project A, and the removal effort per defect in each development phase of project C is the same as that measured for project A (Table 4). A decrease in the upstream defect detection rate increases the number of defects carried over from one phase to the next; for instance, defects not detected in the analysis phase remain to be found in the design phase. Under these assumptions, the total defect removal effort of project C is 41% higher than that of project A (Table 5). We also found that the number of detected defects per function point in the downstream of project A was less than that of project B (Table 6). The quality assurance activities in the upstream contributed to the decrease of defects in the downstream.
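The mechanics of the project C thought experiment can be sketched as follows. Only the per-defect removal efforts (Table 4) and the upstream detection rates (89% vs 43%) come from the paper; the total defect count and the split of escaped defects across the downstream phases are invented assumptions, so the resulting ratio differs from the paper's 141%, which was computed from project A's actual defect profile.

```python
# Per-defect removal effort in person-hours (Table 4 of the paper).
EFFORT_PER_DEFECT = {
    "upstream": 0.25,        # analysis and design
    "unit_test": 1.0,
    "functional_test": 3.5,
    "system_test": 4.0,
}

def total_removal_effort(total_defects, upstream_rate,
                         downstream_split=(0.6, 0.3, 0.1)):
    """Total effort to remove all defects, given the fraction caught
    upstream and an ASSUMED split of escaped defects across unit test,
    functional test, and system test (not data from the paper)."""
    caught_upstream = total_defects * upstream_rate
    escaped = total_defects - caught_upstream
    effort = caught_upstream * EFFORT_PER_DEFECT["upstream"]
    phases = ("unit_test", "functional_test", "system_test")
    for share, phase in zip(downstream_split, phases):
        effort += escaped * share * EFFORT_PER_DEFECT[phase]
    return effort

# Hypothetical 1000 defects: project A catches 89% upstream,
# project C (like project B) only 43%.
effort_a = total_removal_effort(1000, 0.89)
effort_c = total_removal_effort(1000, 0.43)
print(f"project C needs {effort_c / effort_a:.2f}x project A's effort")
```

The qualitative conclusion is insensitive to the assumed numbers: because downstream removal is 4 to 16 times costlier per defect, any drop in the upstream detection rate inflates total removal effort.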
We used a function point counting guideline for OMRON that was derived from the IFPUG guideline [3]. In addition, we found that the percentage of detected defects pertaining to usability requirements in project A was higher than in project B. It can be considered that use-case analysis made analysts describe requirements specifications thoroughly from the users' point of view.

5. Conclusions

We executed quality assurance activities in the upstream of the object-oriented software development process and verified their effectiveness. We also verified that the quality assurance activities were executed efficiently by SEPG members who supported the software developers. To achieve high quality in object-oriented software products, it is not enough to apply the procedures and corresponding CASE tools for object-oriented analysis, design, and programming. The important factor for success is to define quality assurance activities in the upper stream of the object-oriented software development process, and to establish an organisational structure that makes it a rule to execute those activities. These quality assurance activities can be widely applied to object-oriented software development. In the future, we plan to spread their use corporate-wide. We will also develop guidelines on how to tailor the review process to a specific domain, and increase the number of best practices.

6. Acknowledgements

The authors would like to thank Satoshi Yamamoto, Yasuji Iwabuchi, Kazunori Ueda, and Takahiro Inoue for their important contributions to the work described in this paper.

7. References

[1] Booch, G., Rumbaugh, J., and Jacobson, I., The Unified Modeling Language User Guide, Addison-Wesley.
[2] Card, D.N., "Learning from Our Mistakes with Defect Causal Analysis", IEEE Software, Vol. 15, No. 1.
[3] Garmus, D. (ed.), IFPUG Counting Practices Manual, Release 3.0, International Function Point Users Group, Ohio.
[4] Jones, C., Software Quality: Analysis and Guidelines for Success, International Thomson Publishing.
[5] Paulk, M.C., Weber, C.V., Garcia, S., Chrissis, M.B., and Bush, M., Key Practices of the Capability Maturity Model, Version 1.1, CMU/SEI-93-TR-25, Software Engineering Institute.
[6] Schneider, G., and Winters, J.P., Applying Use Cases: A Practical Guide, Addison-Wesley.