How The HP-UX Systems Networking and Security Lab Assures Quality for Release 11i v2


ESG/BCS/EUD/SNSL/SWEET, Hewlett-Packard

Table of Contents

Abstract
1. Background and Introduction
1.1 Goals of This Paper
1.2 CMM Overview
2. Analysis and Experience
2.1 Initial State
2.2 Management Support
2.3 Planning and Tracking CMM Implementation
2.4 Updating Process Assets
2.5 Coaching
2.6 Assessments
2.7 Infrastructure to Institutionalize Quality Performance
3. Results
3.1 Feedback from Managers and Engineers
3.2 Product Results
4. Conclusion
References
Glossary

Abstract

For a long time, HP software labs have fostered a culture of quality. HP was one of the first commercial companies to work with the Software Engineering Institute as it piloted the Software Capability Maturity Model (CMM). Within the HP-UX operating systems organization, the networking software lab of about 150 people chose the CMM as the means to achieve quality products and project stability. This paper describes the two-and-a-half-year journey that the lab took to now operate at CMM Level 3. The lab incrementally instituted a comprehensive quality process infrastructure. Essential elements of the lab's success include:

- Active planning and deployment of the various CMM elements
- Active solicitation and use of senior management support
- Software Quality Assurance (SQA) coaching as the means to institute process changes across all projects in the lab
- Internal CMM-based assessments as a means both to verify CMM implementation and to identify best practices to share across the lab

Enough time has passed for the lab to see improvements in product quality as well as productivity. These improvements will help customers more easily adopt the next HP-UX release, Release 11i v2.

1. Background and Introduction

As the Internet became crucial to businesses in the late 1990s, the networking lab of the HP-UX operating system, along with most of the other labs, decided to enhance quality and productivity in order to better serve the new technological needs of its customers. The networking lab manager was authorized to significantly increase the staff of the lab, and the lab almost doubled in size starting in early 1999. If not managed and trained properly, this large influx of new personnel could have caused a decrease in quality and productivity through poor adherence to good processes.
To offset this risk, in late 2000 the lab manager communicated the goal of having the lab operate at CMM Level 2, plus the Peer Review Key Process Area, by the end of 2001. He created a group of four process engineers to drive the implementation of the various CMM principles throughout the lab. This team, the SoftWare Engineering Effectiveness Team (SWEET), helped lead the lab to achieve the stated goal of operating at CMM Level 2 in December 2001, as determined by an external assessor. In January 2002, the lab manager set a new goal of having the lab operate at CMM Level 3 by the end of 2002. This goal was also achieved.

1.1 Goals of This Paper

The goal of this paper is to show to customers and others the results of the long-term quality goals set by an HP-UX lab as it developed new products, maintained existing ones, and created new HP-UX releases. The lab is well staffed with senior technical engineers who ensure the products have the best technological content and are tested thoroughly. However, this paper focuses on the process maturity used by all in developing, integrating, testing, and maintaining the software. That process maturity is also essential to ensuring a usable and reliable product. The lab's experience, as voiced by the practitioners, has shown that increased process maturity mitigated the risk to product quality that could have been caused by the staff increase, continual personnel turnover, and the reductions in force that came in 2001 and 2002. (CMM is registered with the U.S. Patent and Trademark Office.)

1.2 CMM Overview

The CMM is a framework for assessing the capability of a software organization (Ref 1). The practices of the organization are assessed against a set of 18 categories of best practices called Key Process Areas (KPAs). The KPAs are organized into four maturity levels, two through five, in prerequisite order; level one is an ad hoc level with no KPAs. The KPAs at each level are recommended to be satisfied before proceeding to the next level.
For example, management practices need to be stabilized before effort is spent in stabilizing, improving, and/or automating the engineering practices. For this study, projects were assessed against maturity level two and three KPAs only.

The following thirteen KPAs were in that set:

Level 2:
1. Requirements Management (RM)
2. Software Project Planning (SPP)
3. Software Project Tracking and Oversight (PTO)
4. Software Subcontract Management (SSM)
5. Software Quality Assurance (SQA)
6. Software Configuration Management (CM)

Level 3:
1. Organization Process Focus (OPF)
2. Organization Process Definition (OPD)
3. Training Program (TP)
4. Integrated Software Management (ISM)
5. Software Product Engineering (SPE)
6. Intergroup Coordination (IC)
7. Peer Reviews (PR)

See Reference 1 in the References section below for more information about the CMM.

2. Analysis and Experience

2.1 Initial State

A very significant key to our success was the strong lifecycle foundation that was laid before most of the SWEET team arrived. Without it, the progress would have taken a great deal more work to achieve. At the heart of this foundation was the Product LifeCycle (PLC) web-based tool, which was already built with CMM principles in mind. This tool stored a default set of tasks, deliverables, metrics, and checkpoint reviews. When project managers began a new project, they would create an instance of these tasks, deliverables, and so on, which could be stored and edited as a web page. Project managers could assign commit dates to each task and then track when the actual date occurred. A project manager could even edit the instantiated list to tailor it to the specific needs of the project. Therefore, at a high level, the tailoring of organizational process assets and the planning and tracking elements of the CMM were already part of the lifecycle culture. However, project managers did not consistently use the PLC tool across the lab. New projects used the PLC, but maintenance projects and third-party porting and/or integration projects did not use it as much. Many project managers were creating the content and format of the various deliverables and checkpoint review briefings from scratch each project cycle.
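The PLC's instantiate-and-tailor model described above can be sketched roughly as follows. This is a minimal illustration, not the actual web tool; the task names, dates, and the `instantiate_plc` helper are all hypothetical.

```python
from dataclasses import dataclass, replace
from typing import Optional

@dataclass
class Task:
    name: str
    deliverable: str
    commit_date: Optional[str] = None  # planned date, assigned by the PM
    actual_date: Optional[str] = None  # recorded as the project runs

# A default task set of the kind the PLC tool stored (names are made up).
DEFAULT_TASKS = [
    Task("Define requirements", "Requirements document"),
    Task("Design review", "Design document"),
    Task("Code inspection", "Inspection log"),
    Task("System test", "Test results document"),
]

def instantiate_plc(template=DEFAULT_TASKS, drop=frozenset()):
    """Copy the default lifecycle for one project, tailored by
    dropping tasks that do not apply to this project."""
    return [replace(t) for t in template if t.name not in drop]

# A hypothetical maintenance project tailors out the design review,
# then commits a date for its first remaining task.
plc = instantiate_plc(drop={"Design review"})
plc[0].commit_date = "2001-03-01"
```

Tracking then amounts to comparing `commit_date` with `actual_date` for each task, which is essentially what the checkpoint reviews examined.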
Recreating these materials from scratch consumed a great deal of their time. A coincident but separate effort to bring peer review to a higher level of adoption and rigor in the lab was also a key foundational element.

2.2 Management Support

As mentioned before, management support was solid from the top. The lab manager communicated the goal of operating at CMM Level 2 for 2001 and Level 3 for 2002 to the whole lab. The middle-level managers, that is, the section managers, set up a weekly meeting in which they worked through various quality-practice adoption issues. In mid-2000, the section managers conducted an unofficial initial assessment to see what gaps the lab had in complying with the CMM. The lab manager and section managers instituted a terse set of policies for lifecycle work. All projects, and even patches, needed to have checkpoint reviews with senior management at the various phases of the lifecycle before being released to the customer. A Bimonthly Management Review was established in which all projects, including research-type projects (i.e., those in which a product is not necessarily delivered), give a status report to senior management. Project managers and senior engineers now had to be accountable to senior management for the functionality, effort, cost, quality, and schedule of their projects before delivering to the customer. The Bimonthly Management Review helped institute the need for continual tracking of projects. This especially helped long-term projects whose lifecycle phase checkpoints were spread far apart. Project managers and engineers could see the commitment of their superiors to improving quality using the CMM. This helped the project managers and engineers adopt the various CMM practices without much resistance.

2.3 Planning and Tracking CMM Implementation

The SWEET team of process engineers created a plan of which CMM practices would be deployed to the lab, and when. The improvement plan was based on the initial gap-analysis assessment done by the section managers. For example, the lab had a good culture of tracking projects; however, defining and controlling changes to requirements needed a lot of work. The SWEET team spent considerable effort defining processes for requirements definition and management, which were communicated to the lab. The lab is now consciously ensuring that requirements are traceable throughout the design, code, and test phases of the lifecycle. For each improvement action, SWEET developed the necessary process assets, the section managers were solicited for feedback and approval, and the assets were then communicated to the lab. SWEET followed this sequence of working the improvement plan for two years.

2.4 Updating Process Assets

The structure of the PLC lifecycle tool facilitated the adoption of CMM best practices. However, the content of the tasks, deliverables, etc., needed a major update to ensure all CMM practices would be adopted. It was key to have the PLC hide most of the details of the CMM from the users. Their job was to follow the PLC and continue to use PLC terminology, which is part of this lab's culture; the lab staff should not be forced to learn CMM-specific terminology. With that principle, the PLC tool was augmented with tasks and deliverables specific to third-party code and maintenance lifecycles, as well as with missing CMM practices. The updates were based on surveying the best practices across all projects and synthesizing them into a viable set using the local terminology of the lab. Perhaps the key process assets used by project managers were the checkpoint review templates.
Once the project managers realized that these briefings had to be presented to senior management, SWEET began to populate the content of the briefings with information that required CMM practices to be accomplished. For example, requirements were presented with an ID number for ease of future traceability. Measurements of effort, size, quality, and schedule had to be estimated and updated with actuals over time. Risk management and organizational learning activities also had to be briefed. By shaping and refining these checkpoint review templates, SWEET provided additional training on quality issues for both the presenters and the management approvers. When the complete set of lifecycle process assets was updated, the SWEET team communicated the change to all levels of management and to the engineers. Even with this one-time communication event, the SWEET team realized that the project managers and other users of the process assets would need a more continual consulting arrangement in order to better adopt the lifecycle. Software Quality Assurance (SQA) coaching was born.

2.5 Coaching

Two of the four SWEET process engineers were assigned to work closely with all project managers to ensure the various CMM practices were adopted. Using the SQA Key Process Area set of practices as an initial guide, these two SQA Coaches each worked with half the project managers. They used weekly one-on-one sessions to slowly but deliberately communicate all of the elements needed for a project to operate at CMM Level 2 and, later, CMM Level 3.
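The requirement-ID traceability described above can be illustrated with a small sketch. The IDs and artifact names are invented for illustration; the point is only that a numbered requirement can be checked for coverage across design, code, and test artifacts.

```python
# Hypothetical traceability data: which requirement IDs each lifecycle
# artifact references. IDs and file names are illustrative only.
requirements = {"REQ-101": "IPv6 address parsing",
                "REQ-102": "Socket option defaults",
                "REQ-103": "Kernel tunable documentation"}

artifacts = {
    "design.doc": ["REQ-101", "REQ-102"],
    "parser.c":   ["REQ-101", "REQ-102"],
    "test_plan":  ["REQ-101"],
}

def untraced(reqs, arts):
    """Return requirement IDs that no artifact covers -- the kind of
    gap an SQA coach or assessor would flag at a checkpoint review."""
    covered = {rid for ids in arts.values() for rid in ids}
    return sorted(set(reqs) - covered)

print(untraced(requirements, artifacts))
```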
The following practices are a sample of what was coached:

- Creating, tailoring, and tracking project-specific Product Lifecycles
- Stepping through all of the elements needed to present at a checkpoint review using the new review templates
- Creating a quality plan and a software configuration management plan at the beginning of the project
- Ensuring subcontracted tasks are properly planned and tracked, and that products received from subcontractors are properly accepted
- Posting work artifacts, such as completed deliverables, plans, root cause analyses, and measurement reports, in a common repository used by the whole lab. This helped in the creation of a Process Asset Library (PAL) and a Product Engineering Framework (PEF), which were used to make management and engineering best practices obtained from individual projects accessible to the rest of the lab.
- Training the projects in the need to learn from the experiences and best practices of other projects, as well as to contribute their own best practices to the PAL and PEF repositories for the benefit of others

Over time, the lab migrated away from having the SWEET team be the creator of process assets. Now, the SWEET team facilitates key managers and engineers in creating process assets, especially engineering materials like design and test document templates. The lab as a whole began to take ownership of quality performance, as opposed to quality being driven by the small SWEET team.

2.6 Assessments

To complement the continual SQA coaching of the project managers, a separate SWEET process engineer conducted internal assessments of each project, using Reference 2 as a guide. The goal was to find the areas of the CMM in which a project excelled, as well as the requirements of the CMM with which a project was non-conformant. The deliverables and other work artifacts were assessed, and the project team as a whole was interviewed, using the CMM abilities, commitments, key practices, measurements, and verification requirements as the criteria. The project manager would provide feedback on a draft report. The final report, with recommended actions for resolving non-conformances, would be passed on to the coaches to work with the projects. The use of assessments helped the SQA coaches and project managers focus on the specific areas of the quality lifecycle that needed improvement. An interesting by-product of having one person assess every project was that the lab could finally gather a comprehensive set of the best practices being used by the projects. In the past, the individual projects were too burdened with their specific tasks to communicate a best practice to the rest of the lab.

2.7 Infrastructure to Institutionalize Quality Performance

Each of the major actions described above created the infrastructure needed to institutionalize quality performance for the lab. Figure 1 below visualizes all of the quality activities going on in the lab at the same time. In the figure, LM is Lab Manager, the Quality Manager is the manager of the SWEET group, PM is Project Manager, and ENG is Engineer. One key to this figure is to realize that most of this infrastructure did not exist before the CMM implementation effort began.
Figure 1. Infrastructure to Institutionalize Quality Performance. [Diagram omitted. Its elements: the CMM Implementation Plan and SQA Coach Checklist; the Quality Manager, Assessor, and SQA Coaches, who get approval to implement CMM items, assess and track non-conformances, and coach PMs and engineers to use the processes; the LM and section managers, who communicate goals, reinforce adoption, and receive the Bimonthly Review; and the PMs and engineers, who use and contribute to the lab process assets (PLC, PAL, PEF), within which the CMM is encapsulated.]

3. Results

3.1 Feedback from Managers and Engineers

The adoption of the CMM, as described in the narrative above, produced many responses from the managers and engineers. Over time, as the lab became comfortable with using the various CMM practices as part of their jobs, they began to see the benefits. Even key senior engineers, who were used to working the old way, began to express praise for the focus on quality. Here are some typical comments that reveal the attitude of managers and engineers towards using the CMM.

From a Program Manager: "Following the process is time consuming, but it makes you think in advance about resources and tasks that you might forget. The documented commitment is important. The process makes you think ahead. We usually think one step at a time."

From a Project Manager: "The new engineer responsible for these activities leveraged the existing processes and documents for the new project quickly, with little interaction from the original project engineer."

From a Project Manager: "It took 5 days to do the first Test Results document but only 2 hours to do the second one. Everybody knows exactly what they need to do. We reuse the process, which feeds the documents, which are reused as well."

From a Senior Engineer: "A problem that has troubled me for many years has been the inability to find the necessary materials for each phase of the product lifecycle. Items such as document templates, standards, checklists, policies, and guidelines were not kept in a single place (as far as I could tell). I now find that the approved forms of the above are placed into the product engineering framework. This is now my one-stop shopping for product lifecycle answers relevant to me."

From another Senior Engineer: "The Product Engineering Framework web page has been a very effective way to communicate best practices across the lab. Also, by doing inspections, people have become more aware of the importance of quality. As a side effect of doing inspections, both design-level and code-level documentation has improved."

3.2 Product Results

The lab realizes that customers are focused on results more than on internal processes. The adoption of the CMM, and of a quality culture as a whole, has contributed substantially, though not totally, to the following product results:

- In 2002, the lab resolved over 900 reported code maintenance defects, compared with about 600 defects in each of the previous two years.
- The focus on following a consistent process has enabled engineers to concentrate on technical content rather than recreating forms each cycle.
- The culture of quality spread from being led by the SWEET team to being led by each engineering section of the lab. Each section is now renovating and/or substantially upgrading its product test suites, as well as using various static and dynamic code analysis tools to find as many defects as possible before the code is released.
- Each section has an active, self-managed cross-training program in which projects learn about the technical attributes of their fellow projects in the section.
- The culture of quality is now ingrained in the managers. They now withhold releasing a product until all possible inspection and testing steps are done, as opposed to rushing to release a product regardless of quality risk.
- New third-party products now go through a consistent, rigorous process so that all levels of lab management are aware of the performance attributes of the product in an integrated HP-UX environment.
- The lab now knows its quantitative productivity and estimation capability. For example, the lab knows on average how many new lines of code per effort-month it can produce, and the average percentage by which effort is underestimated at the beginning of the lifecycle compared to the actual effort used. This enables the lab to make more realistic commitments to the customer.
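As a rough sketch of the quantitative capability described just above, the two lab-level averages could be computed like this. The project records are invented numbers for illustration, not the lab's actual measurements.

```python
# Invented project records (not the lab's data): new code size in lines,
# plus estimated and actual effort in engineer-months.
projects = [
    {"new_lines": 12000, "est_months": 20, "act_months": 26},
    {"new_lines":  8500, "est_months": 14, "act_months": 16},
    {"new_lines": 15000, "est_months": 24, "act_months": 30},
]

def productivity(p):
    """New lines of code produced per effort-month."""
    return p["new_lines"] / p["act_months"]

def underestimate_pct(p):
    """Percent of the actual effort that the initial estimate missed."""
    return 100.0 * (p["act_months"] - p["est_months"]) / p["act_months"]

avg_prod = sum(productivity(p) for p in projects) / len(projects)
avg_under = sum(underestimate_pct(p) for p in projects) / len(projects)
print(f"average productivity: {avg_prod:.0f} lines/effort-month")
print(f"average underestimation: {avg_under:.1f}% of actual effort")
```

With averages like these, a new project's size estimate converts directly into an effort commitment, padded by the historical underestimation percentage.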

4. Conclusion

It is important to look at the culture of quality and its impacts as a whole. The CMM helped show the engineers and managers the time savings realized by following a consistent, measured, and continually improving lifecycle process. The staff is now beginning to take ownership of the culture of quality. The infrastructure set up to institutionalize quality (management reviews, actively maintained process assets, coaching, etc.) will ensure the culture does not disappear if a few key people leave the lab. Not only does the lab now know its quantitative capability, but a great deal of form-related rework has also been eliminated. This has given the engineers and managers more time to focus on technical content, as evidenced by the capability to resolve more maintenance defects and to deliver new products that have been thoroughly inspected and tested. Engineers who now have a positive view of the culture of quality are more likely to produce higher-quality products.

References

1. M. Paulk et al., "Capability Maturity Model, Version 1.1," IEEE Software, July 1993.
2. D. Dunaway and S. Masters, "CMM-Based Appraisal for Internal Process Improvement: Method Description," CMU/SEI-96-TR-007, April 1996.
3. W. Humphrey, Managing the Software Process, Addison-Wesley, Reading, MA, 1989.
4. J.G. Brodman and D. Johnson, "Return on Investment from Software Process Improvement as Measured by U.S. Industry," Crosstalk, April.
5. J.D. Herbsleb and D.R. Goldenson, "A Systematic Survey of CMM Experience and Results," Proceedings of ICSE 18, IEEE Computer Society Press, Los Alamitos, CA, 1996.
6. B. Clark, "The Effects of Software Maturity on Software Development Effort," Ph.D. dissertation, University of Southern California, August 1997.
7. F. McGarry, R. Pajerski, and G. Page, "Software Process Improvement in the NASA Software Engineering Laboratory," SEI Technical Report, December.
8. K. Butler, "The Economic Benefits of Software Process Improvement," Crosstalk, Hill AFB, Ogden, UT.
9. J. Herbsleb et al., "Benefits of CMM-Based Software Process Improvement: Initial Results," SEI TR-94-13, August 1994.
10. S. Burke, "Radical Improvements Require Radical Actions: Simulating a High-Maturity Software Organization," SEI Technical Report, June.
11. F. McGarry, S. Burke, and W. Decker, "Measuring Impacts of Software Process Maturity in a Production Environment," Proceedings of the 22nd Software Engineering Workshop, December 1997.

Glossary

CMM Common Features: Attributes of a key process (best practice) that are needed to institutionalize the adoption of the best practices. Commitments (management policy), Abilities (resources and training), Measurements, and Verification are the CMM attributes that comprise the Common Features across all Key Process Areas.

Key Process Area: A best practice in the engineering, management, or organizational disciplines that the CMM recommends an organization adopt. Requirements Management, Training Program, Software Project Planning, and Organization Process Definition are some Key Process Areas of the CMM.

Key Practice: The activities in a Key Process Area specific to the best practice, as contrasted with the Common Features. For example, "An estimate of the size of a project is made according to a documented procedure" is a specific Key Practice of the Software Project Planning Key Process Area.