of Your QA Organization
Michael Hoffman
19 October 2010
PNSQC 2010



2 One software QA organization's effort to evaluate its own health through establishing goals, benchmarking, defining organization-level metrics, and ongoing self-assessment.

3 The company: a large technology company providing products and services that span consumer to enterprise.
The product line: high-volume consumer electronics; 50+ unique product variants delivered each year.
The software QA organization: 4 geographically separated teams, plus vendor sites; ~35 engineers; ~110 testers.
Where I fit in: Senior SW Quality Engineer.

4 Educating a new senior executive
Our shtick:
- Dozens of products validated each year
- Consistently delivered to schedules
- Executed within budgets
- Very low defect escape rates
- Numerous efficiency-focused initiatives
Result: we made a convincing case that we delivered value.
He had just three simple questions.

5 How do you know you're doing the right thing?
What he really meant: prove you are doing what the business needs.

6 How do you know you're improving over time?
What he really meant: prove you are getting better and better at doing it.

7 How do you compare with similar QA organizations?
What he really meant: prove you are world class.

8 What constitutes improving over time?
- Tester: quicker differentiation between improper and expected behavior.
- Customer Support Analyst: fewer issues found in the field.
- Test Engineer: optimizing test coverage based on risk assessment.
- Quality Engineer: identifying issues in earlier lifecycle phases.
- QA Manager: making sure the team has the proper skill sets and tools.
- QA Director: doing more with less.

9 The plan
- Take stock
- Benchmark
- Understand where we stand
- Define measures
- Do a first assessment
- Set thresholds and actions
- Continue to focus on the right thing

10 Goal: collect and organize key categories of information about our organization.
- Organizational Model: how it impacts objectives, priority-setting, and decision-making.
- Relationship to Customer: how SQA is assessed and held accountable for its responsibilities.
- Roles and Responsibilities: SQA roles, their respective responsibilities, and the partnerships with external teams.
- Product Assessment: how product readiness is assessed.
- Process Assessment: how the effectiveness of quality/test processes is assessed.
- Organizational Assessment: how the health of the quality organization is assessed and improved upon.
- Organization Performance: how the quality organization is measured by the company.

11 What we gained:
- A realization of existing self-assessments
- Some obvious self-assessment gaps
- A starting point for benchmarking conversations
- The basis for information we planned to share

12 Goal: learn how other software QA organizations do self-assessment.
Make-up of participating companies:
- Consumer electronics
- Business technology
- Defense contractor
- Service/logistics sector
We sought mutually beneficial information sharing.

13 Topics by category:
- Organizational Model: organization structure; management of outsourcing
- Relationship to Customer: customers' view of the quality organization; relationships with partner organizations; assessing satisfaction
- Roles and Responsibilities: roles/responsibilities within SQA; assessing/tracking performance; ongoing individual development
- Product Assessment: tracking the health of products; assessing product readiness; historical/predictive analysis
- Process Assessment: lifecycle processes; proactive quality processes; assessing process effectiveness; measuring the right things
- Organizational Assessment: responsibility for organization health; assessing/measuring organization health; alignment of organization goals
- Organization Performance: measuring organization performance; effective metrics at an organizational level

14 Identifying differences was straightforward; identifying commonality wasn't.
Considerable variance in approaches due to:
- Product technology
- Company commitment to the QA function
- Maturity of their SQA organization
- Resource constraints
We used a subjective, consensus-based approach for correlating benchmarking data.

15 Analyzing benchmarking results

16 Thesis: apply your best management sensitivity to the interpretation and use of software metrics data.
Functional Management:
- Set clear goals and get your staff to help define metrics for success
- Support your people when their reports are backed by data useful to the organization
- Don't allow anyone in your organization to use metrics to measure individuals
- Understand the data that your people take pride in reporting, don't ever use it against them, and don't ever even hint that you might
- Don't emphasize one metric to the exclusion of others
Project Management:
- Know the strategic focus of your organization and emphasize metrics that support the strategy in your reports
- Don't try to measure individuals
- Gain agreement with your team on the metrics that you will track, and define them in a project plan
- Provide regular feedback to the team about the data they help collect
Project Team:
- Do your best to report accurate, timely data
- Help your managers to focus project data on improving your processes
Practical Software Metrics for Project Management and Process Improvement, Robert B. Grady

17 Common metrics from benchmarking partners:
- Number of defects per line of code
- Test effort (cost) per defect found
- Mean time to fix / validate
- Number of re-opened defects
- Ratio of automated vs. manual test executions
- Percentage of lines of code covered
- Percentage of test plans reviewed
- Number of escapes
- Number of quality audits
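Several of the metrics above reduce to simple ratios over counts from a defect tracker and test-management system. As a rough illustration only (not from the talk), a sketch with hypothetical field names and made-up input values:

```python
# Sketch: computing a few of the common QA metrics as simple ratios.
# All input values are illustrative; real data would come from a
# defect tracker and a test-management system.

def defects_per_kloc(defects_found: int, lines_of_code: int) -> float:
    """Defect density, normalized per thousand lines of code."""
    return defects_found / (lines_of_code / 1000)

def effort_per_defect(test_hours: float, defects_found: int) -> float:
    """Test effort (hours) spent per defect found."""
    return test_hours / defects_found

def automation_ratio(automated_runs: int, manual_runs: int) -> float:
    """Fraction of test executions that were automated."""
    return automated_runs / (automated_runs + manual_runs)

print(defects_per_kloc(42, 120_000))   # 0.35 defects per KLOC
print(effort_per_defect(500.0, 42))    # ~11.9 hours per defect
print(automation_ratio(900, 300))      # 0.75
```

The value of such metrics lies less in any single number than in their trend over time, which is the point the following slides develop.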

18 Focusing on our company's goals:
- Financial
- Customer
- Employee
- Operational Efficiency

19 Financial: defined at the corporate level (budget performance, warranty costs, ...)
Customer:
- Quantitative: field & beta escapes
- Qualitative: deemed out of scope

20 Questions to answer:
1. Do we have the right skill sets?
2. Are we prepared for future needs?
3. Are employees satisfied?
New initiatives to determine:
- Skill level vs. roles
- Capable resources vs. future need
Instituted a semi-annual survey to assess employee satisfaction.

21 Each category is a measure of our organization's ability to do one thing:
- Readiness: meet delivery milestones. Metrics: specification readiness; test case readiness; product readiness
- Fulfillment: deliver on what we promised. Metrics: # of escalations; # of waivers
- Effectiveness: do the right thing. Metrics: defect removal; spoilage
- Efficiency: use fewer resources over time. Metrics: effort per defect; defect merit
- Initiative Status: make timely progress on improvement efforts. Metrics: varied by initiative
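The talk names "defect removal" and "spoilage" without defining them; the sketch below uses the common textbook formulations (defect removal efficiency as the fraction of defects caught before release, spoilage as average defect age in lifecycle phases), which are assumptions here, not the author's stated definitions:

```python
# Sketch of two effectiveness metrics named on the slide, using common
# textbook formulations. The talk does not define them, so these
# definitions are assumptions.

def defect_removal_efficiency(found_pre_release: int,
                              found_post_release: int) -> float:
    """Fraction of all known defects caught before release."""
    return found_pre_release / (found_pre_release + found_post_release)

# One common spoilage formulation weights each defect by how many
# lifecycle phases passed between introduction and discovery.
# Phase indices are illustrative: 0=requirements, 1=design,
# 2=code, 3=test, 4=field.
def spoilage(defects: list[tuple[int, int]]) -> float:
    """defects: (phase_introduced, phase_found) pairs.
    Returns the average defect age measured in phases."""
    ages = [found - introduced for introduced, found in defects]
    return sum(ages) / len(ages)

print(defect_removal_efficiency(95, 5))    # 0.95
print(spoilage([(2, 3), (0, 4), (1, 3)]))  # (1 + 4 + 2) / 3
```

Lower spoilage over time would support the Quality Engineer's improvement goal from slide 8: identifying issues in earlier lifecycle phases.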

22 Goals:
- Understand the necessary data-acquisition processes and tools
- Validate each metric
- Create a baseline
Realization: the necessary foundation was not there for some areas.

23 Focus:
- Identify short-term solutions
- Define longer-term desired capabilities
Spurred initiatives:
- Enhanced tool capabilities
- Cultural change
Result: some areas would be a work in progress.

24 Goal: define acceptable results and corrective actions.
Challenge: how to normalize measurement across vastly different metrics.
- Current measures are within the target threshold, and trending is in the right direction. Action: none needed.
- Current measures are within the target threshold but trending is in the wrong direction, or current measures are moderately outside the target threshold but trending is in the right direction. Action: minor corrective action needed.
- Current measures are significantly outside the target threshold, or current measures are moderately outside the target threshold and trending is in the wrong direction. Action: major corrective action required.

25 Key observation: significant metric interdependencies.
Realization: quantitative results are not sufficient by themselves; a narrative provides necessary interpretation.
Quantitative results + a narrative:
- Identifies where we stand
- Shows if it's the right thing to do
- Illustrates if we're getting better over time
...which is what this endeavor was all about.

26 Question: what should be the frequency of metrics generation and assessment reporting?
Factors:
- Product development cycles
- Product release dates
- Product field life
- Financial milestones
What didn't work:
- Generating based on fixed dates
- Reporting at product lifecycle milestones
Our approach:
- Generate each metric when it made sense
- Report the overall assessment regularly

27 Ever-increasing demands:
- Expanding feature sets
- New technologies
- Limited budgets
- Quicker time to market
Complacency leads to lower quality over time; continuous innovation in SQA approaches is a necessity.
A means of doing organization assessment is one approach to understanding whether you're focusing your efforts in the right place to meet the needs of your business.

28