Software metrics, Jekaterina Tšukrejeva, Stanislav Vassiljev, Pille Haug Tallinn University of Technology Department of Software Science Moodle: Software Quality (Tarkvara kvaliteet) Alternate download: tepandi.ee Version 22.11.2017
Software quality and standards Context and content Basic concepts V&V Quality management Software metrics Software quality management Process frameworks IT audit Idea and motivation Quality models and metrics [- Business oriented] - Product oriented - Process oriented - Usage oriented - Data oriented Example: SW development cost Application of metrics
Idea Metric: a defined measurement method and measurement scale Give examples of metrics in everyday life Give examples of metrics in (agile) software development Example: Project velocity - a measure of how much work is getting done on your project (how many user stories were finished during the iteration?) Quality metric: a quantitative measure of the degree to which an item possesses a given quality attribute. Can be measured and/or calculated Product, process, usage, data,... metrics Basis for evaluation of different aspects of quality, eg in development, maintenance etc Is "You can't control what you can't measure" (Tom DeMarco) right, or is it "wrong to suppose that if you can't measure it, you can't manage it - a costly myth" (W. Edwards Deming)?
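The velocity metric mentioned above can be sketched in a few lines. This is an illustrative example, not a standard implementation; the function names and data are hypothetical.

```python
# Hypothetical sketch: project velocity as the sum of story points
# completed in one iteration; past velocities forecast the next one.
def velocity(completed_story_points):
    """Velocity of one iteration: total points of the finished stories."""
    return sum(completed_story_points)

def average_velocity(iterations):
    """Average velocity over past iterations (a simple forecast)."""
    return sum(velocity(it) for it in iterations) / len(iterations)

# Three past iterations with the points of the stories finished in each:
history = [[3, 5, 8], [5, 5, 2], [8, 3, 3]]
print(average_velocity(history))  # -> 14.0
```

Teams typically use such an average only as a planning aid, since velocity depends on how stories are sized.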
Why should I use software metrics? In procurement: to predict the project cost; to prepare a realistic proposal;... In project management: to estimate project progress; to decide about project acceptance; to provide input for earned value management;... In development: to evaluate program understandability, maturity, or other quality characteristics In testing: to estimate test coverage; to estimate reliability; to estimate number of errors remaining in the code... In maintenance: to evaluate maintenance cost; to estimate number of personnel needed; to estimate waiting time for customers;... To support management control: to evaluate personnel productivity; to handle projects and investments;... Where are we? Where should we go?
Models Quality models and metrics: Quality -> Characteristics (1:n) -> Subcharacteristics -> Metrics (m:n) [Diagram: example characteristics - Compliance, Accuracy, Security, Legality, Compatibility, Ease of use]
Which quality characteristics / processes can be supported by software metrics? [Business metrics: measure benefits / risk / resources (eg, measures and targets to monitor the strategic objectives in the Balanced Scorecard)] Software product quality metrics: measure the capability of software product to satisfy stated and implied needs when used under specified conditions Process metrics: measure development and others (see ISO/IEC 12207) Quality in use metrics: measure the extent to which a product used by specific users meets their needs to achieve specific goals with effectiveness, productivity, safety and satisfaction in specific contexts of use Data quality metrics: measure the degree to which the characteristics of data satisfy stated and implied needs when used under specified conditions (SWEBOK, ISO/IEC 25000, ISO/IEC 12207, COBIT)
Product oriented metric examples Length of code - a measure of the size of a program The larger the size of the code of a component, the more complex and error-prone that component is likely to be Length of identifiers - a measure of the average length of identifiers (names for variables, classes, methods, etc.) in a program The longer the identifiers, the more likely they are to be meaningful and hence the more understandable the program Halstead complexity measures, McCabe's Cyclomatic Complexity, SEI Maintainability Index Related to program analysability/understandability Number of function points For estimating project size Number / density of errors remaining in the code (bugs per line of code)
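To make one of these metrics concrete, here is a sketch that approximates McCabe's cyclomatic complexity of a Python function as 1 + the number of decision points. The chosen set of decision nodes is a simplification of my own; production tools refine these counting rules.

```python
import ast

# Assumed simplification: count these AST node types as decision points.
DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES)
                   for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # -> 3 (two branches + 1)
```

A straight-line function scores 1; each extra branch adds one more path that tests would need to cover.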
Length of code Source Lines of Code (SLOC) or productivity (SLOC/man-year)? Quality / functionality is more important SLOC value may be computed in different ways SLOC use for evaluation may be counterproductive, eg in maintenance a negative LOC/man-year may be a good result Usage in case of fixed SLOC counting standards in certain situations when comparison data is available it provides broad characteristics
Source Lines of Code in COCOMO Only Source lines that are DELIVERED as part of the product are included -- test drivers and other support software is excluded SOURCE lines are created by the project staff -- code created by applications generators is excluded One SLOC is one logical line of code Declarations are counted as SLOC Comments are not counted as SLOC
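A minimal counter in the spirit of the rules above: blank lines and comment-only lines are excluded, everything else counts as one source line. This is a sketch; real logical-SLOC counters also merge statements split across lines and handle block comments, which this does not attempt.

```python
# Simplified SLOC counter: skip blank lines and comment-only lines.
def count_sloc(source: str, comment_prefix: str = "#") -> int:
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith(comment_prefix):
            count += 1
    return count

sample = """# configuration module
import os

DEBUG = True  # an inline comment does not disqualify a line
PATH = os.getcwd()
"""
print(count_sloc(sample))  # -> 3
```

Because different counters make different choices (physical vs logical lines, treatment of declarations), SLOC figures are only comparable under a fixed counting standard, as the previous slides note.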
Models Product quality characteristics (ISO/IEC 25010) [Diagram: Functional suitability, Performance efficiency, Usability, Reliability, Security; Compatibility; Maintainability (modification work); Portability (new environment)]
Product quality (ISO/IEC 25010) ISO/IEC 25010 Functional suitability Functional completeness Functional correctness Functional appropriateness Performance efficiency Time behaviour Resource utilization Capacity Compatibility Co-existence Interoperability Usability Appropriateness recognizability Learnability Operability User error protection User interface aesthetics Accessibility Reliability Maturity Availability Fault tolerance Recoverability Security Confidentiality Integrity Non-repudiation Accountability Authenticity Maintainability Modularity Reusability Analysability Modifiability Testability https://www.iso.org/obp/ui/#iso:std:iso-iec:25010:ed-1:v1:en Portability Adaptability Installability Replaceability
Process oriented metric examples Process/Lifecycle oriented (eg development, maintenance) Coverage of functional/nonfunctional requirements Coverage of equivalence classes / boundary values Percentage of statements / branches /paths covered Reliability Project velocity (XP) Average change request processing time
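Several of the coverage metrics listed above share one shape: covered items divided by total items. A minimal sketch, equally applicable to statements, branches, or equivalence classes:

```python
# Generic coverage metric: (covered items / total items) * 100.
def coverage_percent(covered: set, total: set) -> float:
    if not total:
        return 100.0  # assumed convention: nothing to cover = fully covered
    return 100.0 * len(covered & total) / len(total)

branches = {"b1", "b2", "b3", "b4"}
executed = {"b1", "b3", "b4"}
print(coverage_percent(executed, branches))  # -> 75.0
```

The intersection guards against counting items outside the defined total, e.g. branches executed in code excluded from the coverage target.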
Analysis of (process) metrics Modification level in specification = (D + M + A) / O, where D - deleted functions, M - modified functions, A - added functions, O - original number of functions Adequacy between design and specification = (number of functions in design) / (number of functions in specification) Mean time between failures = (functioning time) / (number of failures) (For each metric: value range? The best value?)
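The three slide formulas written as functions, with variable names following the slide. This is an illustrative sketch, not a standard library:

```python
def modification_level(deleted, modified, added, original):
    """(D + M + A) / O -- 0 means the specification was not changed."""
    return (deleted + modified + added) / original

def design_spec_adequacy(functions_in_design, functions_in_spec):
    """Design/specification ratio -- 1.0 means full correspondence."""
    return functions_in_design / functions_in_spec

def mtbf(functioning_time, number_of_failures):
    """Mean time between failures -- higher is better."""
    return functioning_time / number_of_failures

print(modification_level(2, 5, 3, 50))  # -> 0.2
print(design_spec_adequacy(48, 50))     # -> 0.96
print(mtbf(1000.0, 4))                  # -> 250.0
```

Note the different "best" directions: modification level is best near 0, adequacy near 1, and MTBF as high as possible, which is exactly the discussion point the slide raises.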
ISO/IEC 12207 Information technology - Software life cycle processes: versions First version 1995, only software processes => amendments in 2002, 2004 Current version ISO/IEC 12207:2008, 43 software and system processes Version under development - ISO/IEC 12207:2017 2 agreement processes: Acquisition, Supply 6 organisational project-enabling processes: Life Cycle Model Management, Infrastructure Management, Quality Management,... 8 technical management processes: Project Planning, Risk Management, Configuration Management, Quality Assurance,... 14 technical processes: Systems/Software Requirements Definition, Architecture Definition, Design Definition, Implementation, Verification, Validation, Maintenance,...
Quality in use metric examples Metrics in Service Level Agreements Some service targets => maintenance and/or operation metrics, eg mean time between failures percentage of system uptime time to fulfil a service request time to set up a service for a new user time to reinstate a service after a major failure
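One of the SLA targets above, percentage of uptime over a reporting period, can be sketched as follows (the function name and example figures are illustrative):

```python
# Uptime percentage over a reporting period.
def uptime_percent(period_hours: float, downtime_hours: float) -> float:
    return 100.0 * (period_hours - downtime_hours) / period_hours

# One 30-day month (720 h) with 36 minutes (0.6 h) of downtime:
print(round(uptime_percent(720.0, 0.6), 3))  # -> 99.917
```

SLA texts must pin down the details this sketch leaves open, eg whether planned maintenance windows count as downtime and over which period the percentage is averaged.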
Quality model for quality in use Effectiveness Efficiency Satisfaction Usefulness Trust Pleasure Comfort Freedom from risk Economic risk mitigation Health and safety risk mitigation Environmental risk mitigation Context coverage Context completeness Flexibility https://www.iso.org/obp/ui/#iso:std:iso-iec:25010:ed-1:v1:en
Data quality metric examples Record's field syntactic accuracy: A/B, where - A=number of records with the specified field syntactically accurate - B=number of records (ISO/IEC 25012:2008) Completeness of data within a file: A/B, where - A=number of data required for the particular context in the data file - B=number of data in the specified particular context of intended use Sound data accessibility: A/B, where - A= number of data stored only as sound (e.g. without textual representation) - B= number of data values representing a sound
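The A/B ratios above are straightforward to compute once "syntactically accurate" is operationalised. A sketch for the first metric, field syntactic accuracy, using a deliberately simple email pattern as the accuracy rule (the pattern and data are illustrative, not a real validator):

```python
import re

# A = records whose field matches the expected syntax, B = all records.
def field_syntactic_accuracy(records, field, pattern):
    regex = re.compile(pattern)
    accurate = sum(1 for r in records if regex.fullmatch(r.get(field, "")))
    return accurate / len(records)

customers = [
    {"email": "ann@example.com"},
    {"email": "bob@example.org"},
    {"email": "not-an-address"},
    {"email": "cid@example.net"},
]
print(field_syntactic_accuracy(
    customers, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"))  # -> 0.75
```

Note that syntactic accuracy says nothing about whether the address actually exists; that would be a separate (semantic) accuracy metric.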
Data quality Data quality model characteristics (ISO/IEC 25012) Inherent: Accuracy Completeness Consistency Credibility Currentness Inherent + System dependent: Accessibility Compliance Confidentiality Efficiency Precision Traceability Understandability System dependent: Availability Portability Recoverability
Software development cost prognosis needed, for example, in evaluation of development costs in procurement and bidding ideally: input = software specification, output = development cost / time practically: a complicated task - what about reliability, security, etc? Development environment? Development tools? Level of developers? Experience or a prior database for comparison is needed planning poker, COCOMO model, function points, SoftStar, etc
Effort prediction 1: Planning Poker Each estimator is given one deck of the cards A Moderator, who will not estimate, chairs the meeting. The Product Manager provides a short overview + questions + discussion, recorded by the Project Manager Each estimator lays a card face down representing her estimate. Units can be duration days, man-days, story points, etc All estimators turn their cards over simultaneously People with high and low estimates justify their estimates Repeat the estimation process until a consensus is reached. The developer who is likely to own the deliverable has a large portion of the "consensus vote" When the timer runs out all discussion must cease and another round of poker is played http://en.wikipedia.org/wiki/Planning_poker https://www.youtube.com/watch?v=paxymek5jy4
Software development effort and cost prediction 2 COCOMO and others Evaluation of effort needed* Cost evaluation* LoC Function Points Complexity of the task: inputs, outputs, files etc *Empirical models based on previous experience: project databases *Even calibrated models can give results that differ from the actual outcome *Literature and software are available, 2014
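As a concrete illustration, a sketch of the basic COCOMO 81 equations, Effort = a * (KLOC)^b person-months and TDEV = c * Effort^d months, using the published coefficients for an "organic" (small, familiar, in-house) project. COCOMO II, covered in the Boehm reference below, refines this with cost drivers and scale factors.

```python
# Basic COCOMO 81, organic mode: a = 2.4, b = 1.05, c = 2.5, d = 0.38.
def cocomo_effort(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Effort in person-months: a * KLOC^b."""
    return a * kloc ** b

def cocomo_schedule(effort_pm: float, c: float = 2.5, d: float = 0.38) -> float:
    """Development time in months: c * Effort^d."""
    return c * effort_pm ** d

effort = cocomo_effort(32.0)       # a 32 KLOC project
months = cocomo_schedule(effort)
print(round(effort, 1), round(months, 1))
```

The slide's caveat applies here too: even a calibrated model gives an estimate, not the actual outcome, so the coefficients should be recalibrated against the organisation's own project database.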
SystemStar choice of model creation of the tree of components evaluation of LoC, FP cost evaluation, reports, etc choice of programming languages? Databases? http://www.softstarsystems.com/
Introducing metrics Start from the [business] process needs, not from the metrics Define purpose, usage, and type of metric Understand how the metric is used and what are the benefits Understand how the metric is collected and what are the costs Perform cost-benefit analysis Define the metric, analyse if it satisfies the requirements for metrics Define target values, comparison indicators, etc Design and implement processes for metrics data reporting, collection, usage Explain, introduce, improve
Requirements for metrics General - Relevant - Valid - Reliable - Comprehensive - Mutually exclusive Operative - Easy and simple - Does not require additional data collection - Immune to biased interventions
Different organisations / discussion Needs and metrics? IT company Customer company Startup Large organisation Small organisation Discussion on metrics Extensive topic No detailed consensus on classification Relatively little data on usefulness Know the principles Justify cost Use common sense... looking forward
Software metrics: takeaway Why? Concept, idea, applications What? Content How? How can my organisation benefit? How to implement? Who? Organisation behind. Who might implement? Compare: IT company, customer company, startup, large, small When? When to use it, when not? Advantages, disadvantages? Where? Relationship to other methods Concepts and +/- Metrics for product, process, usage, data[, business] SLOC Function points COCOMO Software cost estimation Tools examples Requirements for metrics Introducing metrics
Additional reading (examples) Daniel Galin. Software Quality Assurance: From Theory to Implementation. Pearson / Addison-Wesley. Chapter 21. Ian Sommerville. Software Engineering. Ninth Edition. Addison-Wesley. Ch 24.4. Guide to the Software Engineering Body of Knowledge (SWEBOK), IEEE. Chapter 4, Section 4. Certified Tester Foundation Level Syllabus, ISTQB. Chapters 5.2.5, 5.3.1. Boehm, B. et al. Software Cost Estimation with COCOMO II. Prentice-Hall, 2000. David Garmus, David Herron. Function Point Analysis: Measurement Practices for Successful Software Projects. Addison-Wesley Information Technology Series, 2000.