BIO PRESENTATION F1 10/20/2006 10:00:00 AM
Keeping It Between the Ditches: A Dashboard to Guide Your Testing
Randy Rice, Rice Consulting Services, Inc.
International Conference on Software Testing Analysis and Review
October 16-20, 2006, Anaheim, CA, USA
Randall W. Rice
Randy Rice is a leading author, speaker, and consultant in the field of software testing and software quality. Rice, an ASTQB Certified Tester Foundation Level (CTFL), a Certified Software Quality Analyst, and a Certified Software Test Engineer, has worked with organizations worldwide to improve the quality of their information systems and optimize their testing processes. Randy has over 28 years' experience building and testing mission-critical projects in a variety of environments and has authored over 20 training courses in software testing and software engineering. He is a popular speaker at international conferences on software testing and is also publisher of The Software Quality Advisor newsletter. He is co-author, with William E. Perry, of the books Surviving the Top Ten Challenges of Software Testing and Testing Dirty Systems, published by Dorset House Publishing Co. Randy also serves on the board of directors of the American Software Testing Qualifications Board (ASTQB). In 1990, Randy founded Rice Consulting Services, of which he is Principal Consultant and Trainer.
Keeping It Between the Ditches: A Dashboard to Guide Your Testing
© 2006, Rice Consulting Services, Inc.
The Roadmap for This Session
- Driver's Ed Revisited
- The Techniques
- Understanding the Dashboard
- Other Things Needed to Keep Testing on Track
Driver's Ed Revisited
Remember what they told us?
- Keep your eyes on the road
- Pay full attention to your driving
- Fasten your seatbelt
- Be courteous
- Don't speed
- Obey all laws
Some of these apply to us in driving software testing projects.
A Testing Project is Somewhat Like Driving
You need to know your:
- Destination
- Location
- Orientation (direction)
- Progress
- Speed
- Resource levels (gas, oil, etc.)
- Engine operation (temperature, charge, etc.)
The Goal
Arrive at the desired destination safely:
- Stay on the road
- Make good progress
- Don't get lost
- Don't run out of fuel
- Only one driver at a time
The Techniques
- An effective testing strategy: defines the test objectives, scope, and approach early in the project.
- A workable test plan: defines scope, resources, schedules, risks, contingencies, etc.
- A dashboard: monitors defect levels, test progress, and resource levels.
The Test Strategy
This is a concise, high-level document that communicates, for a specific project:
- The purpose of testing
- The major test objectives
- The scope of testing
- What is unique about this project?
- Critical success factors
- The test approach
The test strategy helps ensure you are headed in the right direction.
The Test Plan
The test plan is a project plan for the test that describes:
- The test objectives
- The scope of testing (what should and should not be tested)
- The resources needed
- The schedule and timelines
- Risks and contingencies
- Pass/fail criteria
- General test procedures
The test plan is a roadmap for testing that directs you to your destination.
The Test Dashboard
Dashboards are nothing new; they have been a common topic in articles and at conferences for several years. At the same time, testers often struggle with how to convey accurate and timely information to management. So let's explore dashboards and look at some examples. Then we'll look at the issues behind test measurement and reporting.
The Main Objective
To provide simple, meaningful, and reliable information in one place to help guide the testing effort and convey that information to our clients.
What is a Testing Dashboard?
A testing dashboard, just like a car's dashboard, is a set of indicators that show the current status of testing. Dashboards can be seen from various perspectives:
- Project
- Testing
- Ongoing system maintenance
Why Have a Testing Dashboard?
- For fast and easy reporting of test results to management
- To have all of your testing information in one place
- To help guide the testing effort
- To help make good decisions
- To build project learning (better estimates in the future)
- To build the credibility and visibility of testing
What is Required for a Dashboard?
- Accurate and meaningful measurements and metrics
- A culture of trust and openness
- Non-intrusive ways to measure; ideally, the measures should come from activities already being tracked (e.g., defect tracking systems)
Types of Dashboards
- Low-tech: white boards, spreadsheets
- Elaborate: allows input from distributed teams. Examples: DART (http://public.kitware.com/dart/html/index.shtml), SPAWAR (U.S. Navy)
What is Shown on a Typical Testing Dashboard
- Coverage: requirements, functions, test cases
- Status: of testing, defect resolution, readiness for deployment
- Progress: based on test goals and objectives
- Blockages
- Risk: technical, business, project
What Makes a Good Metric?
- Simple: easily measured and understood
- Can be automated: so we don't have to take time to measure and record manually, and people don't get the chance to manipulate the measures
- Meaningful: we can gain useful information to make decisions
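The "simple and automatable" criteria above can be made concrete: a dashboard metric such as pass rate or requirements coverage is a few lines of code over records you already track. The sketch below is a minimal illustration; the record fields ("status", "requirement") are assumed names, not any particular tool's schema.

```python
# Minimal sketch: deriving dashboard metrics automatically from
# test-case records, so nobody counts (or massages) them by hand.
# Field names are illustrative assumptions, not a real tool's schema.

def pass_rate(test_cases):
    """Percent of executed test cases that passed."""
    executed = [t for t in test_cases if t["status"] in ("pass", "fail")]
    if not executed:
        return 0.0
    passed = sum(1 for t in executed if t["status"] == "pass")
    return round(100.0 * passed / len(executed), 1)

def requirements_coverage(test_cases, requirements):
    """Percent of requirements touched by at least one executed test."""
    covered = {t["requirement"] for t in test_cases
               if t["status"] in ("pass", "fail")}
    return round(100.0 * len(covered & set(requirements)) / len(requirements), 1)

cases = [
    {"requirement": "R1", "status": "pass"},
    {"requirement": "R1", "status": "fail"},
    {"requirement": "R2", "status": "pass"},
    {"requirement": "R3", "status": "not run"},
]
print(pass_rate(cases))                                  # 66.7
print(requirements_coverage(cases, ["R1", "R2", "R3"]))  # 66.7
```

Because the numbers are computed rather than typed in, they are harder to manipulate and cheap to refresh at every build.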
Examples of Dashboards
Low-Tech
Source: James Bach, "A Low-Tech Testing Dashboard," http://www.satisfice.com/presentations/dashboard.pdf
Low-Tech Legend
- Area: this can be any function, feature, or attribute to be tested
- Effort: ranges from none to ship
- Coverage: ranges from 0 (no info) to 3 (most rigorous)
- Quality Assessment: summary of status
- Comments: any additional helpful information
For more details on each dashboard item, see "A Low-Tech Testing Dashboard" by James Bach: http://www.satisfice.com/presentations/dashboard.pdf
Low-Tech Dashboard Considerations
- Keep the number of areas under 20 or so; also, try to keep the areas consistent in scope
- Area names should be easily understood
- Color coding helps: red (bad stuff happening), yellow (warning), green (looking good)
- Frequency: update at each build, 2-5 times per week
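The red/yellow/green coding above is most useful when everyone applies it the same way. One option is to write the rule down as code so the board (or a script) colors areas consistently; the thresholds below are illustrative assumptions, not values from the presentation.

```python
# Sketch of the slide's color coding expressed as an explicit rule.
# Thresholds (50% and 80%) are illustrative assumptions.

def area_color(pass_pct, open_blockers):
    """Map an area's pass percentage and blocker count to a status color."""
    if open_blockers > 0 or pass_pct < 50:
        return "red"      # bad stuff happening
    if pass_pct < 80:
        return "yellow"   # warning
    return "green"        # looking good

print(area_color(95, 0))  # green
print(area_color(70, 0))  # yellow
print(area_color(70, 2))  # red
```

Writing the rule down once also prevents the color from drifting with whoever happens to be holding the marker.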
Low-Tech Dashboard
Advantages:
- Easy to implement
- Easy to understand
- Easy to update
Challenges:
- Distribution of information
- Making sure it doesn't get erased
Spreadsheets
Adapted from "By the Dashboard Light: Providing Information, Not Data" by Johanna Rothman, www.stickyminds.com
Spreadsheets (2)
- Easily maintained
- Easily distributed
- Not limited by board space
- New columns can be easily added
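A spreadsheet dashboard is just rows of areas and columns of indicators, which means it can be regenerated from tracking data at every build rather than edited by hand. A minimal sketch using only the standard library (column names and data are illustrative):

```python
# Sketch: generating a spreadsheet-style dashboard as CSV from
# tracking data. Column names and sample figures are illustrative.
import csv
import io

rows = [
    {"area": "Login",  "tests_run": 40, "tests_passed": 36, "open_defects": 3},
    {"area": "Search", "tests_run": 25, "tests_passed": 20, "open_defects": 5},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["area", "tests_run", "tests_passed",
                                         "open_defects", "pass_pct"])
writer.writeheader()
for r in rows:
    # Derived column: computed, not typed in
    r["pass_pct"] = round(100 * r["tests_passed"] / r["tests_run"], 1)
    writer.writerow(r)

print(buf.getvalue())
```

Adding a new indicator is just another entry in `fieldnames`, which is the "new columns can be easily added" advantage in practice.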
Randy's Graphical Dashboard
[Figure: gauge-style indicators showing % requirements tested (75%), % requirements tested & passed (65%), % test cases executed (80%), % test cases passed (70%), % open defect reports, and % tester utilization, alongside a quality-attribute view covering efficiency, correctness, usability, maintainability, security, compatibility, portability, and interoperability]
Kiviat Charts
[Figure: Kiviat (radar) chart with axes for Correctness, Integration, Reliability, Portability, and Usability, and rings at 20%, 40%, 60%, 80%, and 100%]
- Each of these areas is a desired attribute of the application or system.
- Each ring shows the relative score for each attribute.
- Example: six of ten usability tests have passed, so the usability axis scores 60%.
Kiviat Charts (2)
- Easily understood
- Show at a glance: coverage, strengths and weaknesses
- Can be a part of other dashboards
- Are great to show when you don't have much time in a presentation
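The rings on a Kiviat chart come straight from per-attribute pass ratios, so the data behind the chart is a one-line calculation per axis. The sketch below computes those scores, reproducing the slide's usability example (six of ten tests passed, so 60%); the other pass counts are illustrative, and the radar plot itself (e.g., with a charting library) is left out to keep the sketch dependency-free.

```python
# Sketch: per-attribute scores that feed a Kiviat (radar) chart.
# Pass/total counts are illustrative, except Usability (6 of 10),
# which matches the slide's example.

results = {
    "Correctness": (9, 10),
    "Integration": (7, 10),
    "Reliability": (8, 10),
    "Portability": (4, 10),
    "Usability":   (6, 10),
}

scores = {attr: round(100 * passed / total)
          for attr, (passed, total) in results.items()}

for attr, score in scores.items():
    print(f"{attr}: {score}%")
```

Each score is then plotted as the distance from the center along that attribute's axis, so weak attributes show up as dents in the polygon at a glance.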
Expanding the View: Project Dashboards
- Have the same characteristics, but more points of measurement
- Contain testing measures
- Guide the entire project, not just testing
Sample Project Dashboard
Source: U.S. Navy, sepo.spawar.navy.mil/metrics.ppt
Sometimes You Need a Navigator
- I use a Garmin StreetPilot 2820 GPS navigator.
- You might need a mentor, a consultant, etc.
- Don't be afraid to ask for help.
- However, the guide could be wrong!
Words of Warning
- Too many items on a dashboard can be distracting and confusing (unless you are flying a plane!)
- Metrics can be abused; if people don't understand human behavior, more harm than good can result.
- Stuff happens: things not shown on your dashboard can derail your test.
Other Concerns
- Is the testing process working as desired?
- Are we keeping our eyes on the objectives?
- Are there multiple people trying to drive at one time?
- Are we avoiding the potholes?
Keeping the Process Working
- The dashboard tells you about vehicle (process) malfunctions.
- In testing, the process is the engine.
- The process might not be documented.
- How you perform the process determines whether or not you reach the intended destination.
Keeping Your Eyes on the Road
Test objectives and plans keep us on track, if we pay attention to them!
Dealing With Multiple Drivers
- There can be team input, but ultimately only one leader.
- When things start to go wrong, people tend to lose confidence in the leader.
- This means the leader must: make the right corrections, reassure the team, seek input from the team, and be decisive.
Common Potholes
- Excessive defect levels
- Out-of-scope distractions
- Unexpected changes: reorganizations, application/system changes, project changes
- Team strife (be courteous)
Final Thought
A key purpose of testing is to provide meaningful information to management to make informed decisions.
Other Resources
- "By the Dashboard Light: Providing Information, Not Data" by Johanna Rothman, www.stickyminds.com
- "A Low-Tech Testing Dashboard" by James Bach, http://www.satisfice.com/presentations/dashboard.pdf