FROM DATA TO INSIGHT: TRENDS IN ISBSG DATA COLLECTION & SOFTWARE DEVELOPMENT. IT Confidence 2017, Beijing, 20th September 2017


1 FROM DATA TO INSIGHT: TRENDS IN ISBSG DATA COLLECTION & SOFTWARE DEVELOPMENT. IT Confidence 2017, Beijing, 20th September 2017. Thomas Gordijn, ISBSG Repository Manager

2 OVERVIEW
- Introduction
- Size Matters: Insight, Findings
- Project Phase Ratios: Insight, Findings
- ISBSG Data Collection: Trends

3 INTRODUCTION Name: Thomas Gordijn, MSc. Current: Senior Consultant Benchmarking, METRI; Repository Manager, ISBSG. Past: Data Manager, National Intensive Care Evaluation (NICE), AMC. Studied Medical Informatics at the University of Amsterdam. Contact ISBSG: METRI:

4 ABOUT ISBSG
Organization: The ISBSG is a not-for-profit organization founded in 1997 by a group of national software metrics associations.
Goal: ISBSG is an independent international organization that collects and provides industry data in order to help all organizations in the software industry (commercial and government, suppliers and customers) to understand and improve their performance. ISBSG sets the standards for software data collection, software data analysis and software project benchmarking processes.
Help us to collect data: ISBSG is always looking for new data. In return for your data submission, you receive a report that shows the performance of your project or contract against relevant industry peers. Please submit your data through one of the forms listed on: Alternatively, submit your raw data for processing (for example, in Excel format).
Partners: This page will help you to find an ISBSG partner in your country:

5 About METRI
Benchmarking: Started out in benchmarking; more than 14 years of history and track record. Market leader in the Benelux, with an international focus. USP: Component Based Measurement.
Research: Fact-based research; market research and knowledge projected onto your daily challenges, using METRI's Eco System for reflection and verification.
IT Advisory: Expanded services building upon the proprietary benchmarking database: Service-, Cost-, Performance- and Value management; IT sourcing: strategy, selection, contracting, transition & provider relationship management.
IT Governance: Quick scan of the IT function; target operating model; governance of digital transformation; benchmark of the IT workforce.

6 SIZE MATTERS

7 INSIGHT - SIZE GROUPS

Size groups (relative size):

  Relative size             Abbreviation   Functional size (FP)
  Extra extra small         XXS            >= 0 and < 10
  Extra small               XS             >= 10 and < 30
  Small                     S              >= 30 and < 100
  Medium 1                  M1             >= 100 and < 300
  Medium 2                  M2             >= 300 and < 1,000
  Large                     L              >= 1,000 and < 3,000
  Extra large               XL             >= 3,000 and < 9,000
  Extra extra large         XXL            >= 9,000 and < 18,000
  Extra extra extra large   XXXL           >= 18,000

Programming languages: Java, Visual Basic, C, C++, C#, Delphi, Visual C++, PHP, JavaScript, Pascal, Python.

Insight: Organizations have traditionally created default application size groups in order to select applications for comparison and benchmarking purposes. However, pre-classifying applications based on absolute functional size can lead to incorrect grouping of applications and invalid comparisons. ISBSG therefore also includes relative size in the repository and in corporate releases.
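The relative-size grouping can be expressed as a small lookup function. This is an illustrative sketch (the function name is ours); the boundary values come from the size-group table on the slide above:

```python
def relative_size(fp: float) -> str:
    """Map a functional size in function points (FP) to an ISBSG relative size group."""
    # Upper bounds taken from the size-group table; each group covers sizes
    # from the previous bound (inclusive) up to the listed bound (exclusive).
    bounds = [
        (10, "XXS"), (30, "XS"), (100, "S"), (300, "M1"),
        (1_000, "M2"), (3_000, "L"), (9_000, "XL"), (18_000, "XXL"),
    ]
    for upper, label in bounds:
        if fp < upper:
            return label
    return "XXXL"  # 18,000 FP and above

print(relative_size(250))    # M1
print(relative_size(18000))  # XXXL
```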

8 INSIGHT - PRODUCTIVITY & FUNCTIONAL SIZE [Chart: productivity difference compared to the median of the complete dataset, per size decile S 10% through S 100%]

9 FINDINGS - PRODUCTIVITY & FUNCTIONAL SIZE

Productivity (hours/FP) per size decile:

  Size group   Median   P75
  S 10%        12.6     38.3
  S 20%        15.3     30.4
  S 30%        13.9     30.2
  S 40%        13.3     26.8
  S 50%        9.8      21.1
  S 60%        10.5     17.1
  S 70%        10.0     24.8
  S 80%        9.3      19.9
  S 90%        6.9      12.3
  S 100%       5.9      10.0
  Total        10.2     22.9

Findings: The results suggest that three size categories exist across the total range of projects up to 10,571 FP: Group 1 (< 135 FP) is less productive than the overall median, Group 2 (135 to 515 FP) sits around the overall median, and Group 3 (> 515 FP) is more productive than the overall median. Development teams working on projects with a functional size larger than approximately 515 function points (FP) are more productive; teams working on projects smaller than approximately 135 FP are less productive.
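Productivity here is the project delivery rate (PDR) in hours per function point, summarized by quartiles. A minimal sketch of that computation (the project data below is invented purely for illustration, not taken from the ISBSG repository):

```python
import statistics

# Hypothetical (effort_hours, functional_size_fp) pairs -- invented for
# illustration, not real ISBSG data.
projects = [(1200, 95), (5400, 520), (800, 40), (9800, 1100), (2600, 310)]

# Project delivery rate (PDR) in hours per function point: lower is better.
pdr = sorted(hours / fp for hours, fp in projects)

# statistics.quantiles(n=4) returns the three cut points [P25, median, P75].
p25, median, p75 = statistics.quantiles(pdr, n=4)
print(round(p25, 1), round(median, 1), round(p75, 1))
```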

10 FINDINGS - PRODUCTIVITY & FUNCTIONAL SIZE Findings: Development teams working on projects of different functional sizes cannot necessarily be compared when analyzing productivity. When judging the productivity of a development team, it should be compared to teams working on projects of a similar functional size.

11 PROJECT PHASE RATIOS

12 INSIGHT - PROJECT PHASE RATIOS - GENERIC TREND

Project phase ratios, new development (2006 vs 2017):

  Phase            2006   2017
  Planning         9%     8%
  Specification    11%    11%
  Design           15%    14%
  Build            43%    41%
  Test             16%    16%
  Implementation   6%     9%

Project phase ratios, enhancement (2006 vs 2017):

  Phase            2006   2017
  Planning         9%     6.7%
  Specification    9%     10.7%
  Design           13%    12.2%
  Build            39%    30.0%
  Test             25%    33.2%
  Implementation   5%     7.2%

Findings (project phase ratios 2006 vs 2017): Comparing the phase percentages between 2006 and 2017, for both new development and enhancement projects, less effort goes into the planning and design phases. For enhancement projects the build percentage decreased and the test percentage increased, indicating an increased focus on testing added functionality. Lastly, the specification and implementation percentages have increased relative to total effort spent in projects.
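The phase ratios above are each phase's share of total project effort. A minimal sketch of that calculation (the effort figures below are hypothetical, chosen only to illustrate the arithmetic; they are not ISBSG data):

```python
# Hypothetical effort breakdown (hours) for a single enhancement project --
# the figures are invented for illustration.
effort = {
    "Planning": 120, "Specification": 190, "Design": 220,
    "Build": 540, "Test": 600, "Implementation": 130,
}

total = sum(effort.values())

# Each phase's share of the total effort, as a percentage.
ratios = {phase: round(100 * hours / total, 1) for phase, hours in effort.items()}
print(ratios)
```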

13 INSIGHT - PROJECT PHASE RATIOS - AGILE

New development project phase ratios, Agile vs Other:

  Phase            Agile   Other
  Planning         10.1%   7.7%
  Specification    8.8%    12.0%
  Design           13.6%   14.4%
  Build            42.9%   40.8%
  Test             12.3%   16.8%
  Implementation   12.3%   8.4%

Enhancement project phase ratios, Agile vs Other:

  Phase            Agile   Other
  Planning         4.8%    6.8%
  Specification    4.4%    10.9%
  Design           30.3%   11.6%
  Build            31.6%   29.9%
  Test             22.6%   33.6%
  Implementation   6.3%    7.2%

Findings (project phase ratios, Agile vs other): Comparing Agile projects to projects applying other methodologies reveals interesting differences in project phase ratios. These differences might be explained by the higher level of automation of repeated/standard activities (mainly in testing and implementation) and by the fact that Agile projects have more frequent and regular interaction with, and validation of results by, customers and the business.

14 DATA COLLECTION ISBSG

15 TRENDS - DATA COLLECTION ISBSG
Software development:
- Agile software development is changing the IT industry:
  - Sizing: low adoption of function point analysis;
  - Estimation: story points;
  - Planning: planning poker;
  - Process: development methodology;
  - Team: focus on team efficiency;
  - Benchmarking: impossible due to a lack of standardized output measurements.
- More complexity in integrating software into existing environments;
- More focus on insight into the risk of an organization's total application landscape:
  - Complexity;
  - Architectural principles;
  - Quality;
  - Transferability (cloud readiness);
  - Cost.

16 TRENDS - DATA COLLECTION ISBSG
Trends in data collection:
- Less focus on collecting complete historic datasets within organizations;
- Less sharing of information within organizations as a basis for company-wide improvement (team focus);
- Increased focus on quality metrics and automated measurement integrated in the development environment;
- Default settings of tooling rarely used; tooling is often tuned to organization-specific environments;
- Comparability of data is hard due to a lack of overall standards;
- Less data available based on function points:
  - Automated measurement systems based on lines of code;
  - Automated function points;
- High volumes of data within Agile development teams (automated quality checks, automated testing, burn-down charts, estimates, planning, functional descriptions, team composition):
  - Hard to use this data as input for improvements at the organization level, as opposed to team optimization.
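One of the trends above is automated measurement based on lines of code. As a deliberately crude illustration of the idea (our own sketch, not ISBSG tooling or an automated function point counter), here is a counter for non-blank, non-comment lines in a Python source file:

```python
from pathlib import Path

def count_loc(path: str) -> int:
    """Count non-blank, non-comment lines in a Python source file.

    A crude size proxy only: it does not handle docstrings, multi-line
    strings or trailing inline comments, which real measurement tools do.
    """
    loc = 0
    for line in Path(path).read_text().splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            loc += 1
    return loc
```

Real automated sizing parses the code's structure rather than counting lines; this sketch only shows the principle of deriving a size metric directly from the development artifacts themselves.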

17 CHALLENGES - DATA COLLECTION ISBSG
Challenges for ISBSG:
- Developing a repository for code quality metrics;
- Many proprietary datasets that are not shared outside (for-profit) organizations;
- Agile and DevOps data collection, with or without function point analysis;
- Lack of standards comparable to NESMA, IFPUG, COSMIC, etc.;
- How to start collecting data from automated processes such as:
  - Quality control systems;
  - Automated testing;
  - Issue & project tracking systems.
Challenges for organizations:
- Management topics requiring insight into:
  - Team composition (HR);
  - Team size;
  - Planning functionality on time;
  - Risks (quality);
  - Productivity;
  - Cost/budget;
  - Supplier control (outsourced services).
- These questions remain relevant in modern (Agile) software development environments: we still need metrics for decision making at the organization/management level.

18 THANK YOU ISBSG: METRI:

19 APPENDIX

20 ISBSG - DATA COLLECTION Help us to collect data: ISBSG is always looking for new data. In return for your data submission, you receive a report that shows the performance of your project or contract against relevant industry peers. Please submit your data through one of the forms listed on: Alternatively, submit your raw data for processing (for example, in Excel format).