Software Quality & Productivity. Berlin, June 2013


1 Software Quality & Productivity Berlin, June 2013

2 Topics
- Software Quality Improvement
- Productivity

3 Capgemini In Figures (2012)
- Revenue: €10,264 million
- Operating margin: €787 million
- Operating profit: €601 million
- Profit for the year attributable to shareholders: €370 million
- Net cash and cash equivalents: €872 million
Cap Gemini S.A. is a member of the CAC 40, listed in Paris (ISIN code: FR). Note: our brand name is Capgemini, but the name of our share on the stock exchange is Cap Gemini S.A.
Revenue by business: Technology Services 40.4%, Outsourcing Services 39.8%, Local Professional Services 14.9%, Consulting Services 4.9%
Revenue by industry: Public Sector 23.2%, Financial Services 19.8%, Manufacturing 17.8%, Customer products, retail, distribution & transportation 14.1%, Energy, Utilities & Chemicals 11.3%, Telecom, Media & Entertainment 9.1%, Other 4.7%

4 A Global Delivery Network: 44 countries and 100 languages (as of December 31, 2012)
Group workforce: 125,110, of which 50,425 working offshore
Workforce by region: France 21,110; India 41,019; North America 9,609; Germany & Central Europe 9,581; Latin America 9,399; Benelux 9,186; UK & Ireland 8,964; Iberia 4,812; Nordic Countries 4,504; Asia Pacific 3,748; Italy 2,524; Morocco 628; Middle East & Africa 26
[World map: delivery locations including the United States, Canada, Mexico, Guatemala, Colombia, Brazil, Chile, Argentina, all over Europe, Morocco, South Africa, the United Arab Emirates, India, the People's Republic of China, Taiwan, Japan, Vietnam, the Philippines, Malaysia, Singapore, Australia and New Zealand]

5 Quality Improvement Programme

6 Why is this important?
Basic unit-level errors account for 92% of the total errors in the source code, but for only 10% of the defects in production. Bad coding practices at the system level account for only 8% of the errors, yet lead to 90% of the serious reliability, security and efficiency issues in production. Meanwhile, tracking system-level programming errors could save more than half of the rework during the build phases, while drastically decreasing the production incident rate. (OMG/SEI)
[Chart: Top 3 reasons why clients are undertaking sourcing transactions, ISG (TPI) 2012: Improve quality 47%, Save costs 84%, Establish variable cost structure 74%; also cited: Focus on core/strategic areas, Access resources/skills/expertise, Avoid capital expenditure, Increased delivery speed (5%, 11%, 47%, 32%)]
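
To make the unit-level vs. system-level distinction concrete, here is a small illustrative Java sketch; the OrderDao class, its connection URL and the choice of flaws are hypothetical examples, not taken from the CAST rule set. The missing argument check is a typical unit-level error that a review or unit test catches cheaply, while the leaked connection is the kind of cross-layer reliability problem that only surfaces under production load.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Illustrative only: OrderDao, its schema and its connection URL are hypothetical.
public class OrderDao {

    private static final String URL = "jdbc:h2:mem:orders";

    // Unit-level flaw: no null check, so a null reference throws a
    // NullPointerException; easy to spot in review or unit tests.
    public String normalizeReference(String reference) {
        return reference.trim().toUpperCase();
    }

    // System-level flaw: the data-access layer never releases its JDBC
    // resources on the error path (statement and result set are not closed at
    // all), so under production load the connection pool is slowly exhausted.
    public int countOrders(String customerId) throws SQLException {
        Connection connection = DriverManager.getConnection(URL);
        PreparedStatement statement =
                connection.prepareStatement("SELECT COUNT(*) FROM orders WHERE customer_id = ?");
        statement.setString(1, customerId);
        ResultSet resultSet = statement.executeQuery();
        resultSet.next();
        int count = resultSet.getInt(1);
        // no try-with-resources / finally: if executeQuery() throws, the
        // connection is leaked
        connection.close();
        return count;
    }
}
```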

7 Why Automate Structural Quality Control?
- Software «Blood Test»
- Estimation & Productivity
- Systematic use in software quality gates, leading to a reduction in re-work effort
- Enables middle management to test the quality of the software being developed
- «Take care of the cows and not the cowboys»

8 Capgemini CAST Background
- 2008: corporate relationship established
- 2010: Morocco AIC integrated with delivery
- Multi-year, enterprise-wide license agreement
- Mumbai AIC integrated with delivery
- Rapid Portfolio Analysis Services incorporated
- Capgemini has the first certified AIC outside CAST

9 Improving Structural Quality
Upfront feeding of common violations, rules and engineering best practice (Common Violations ++)

10 Example Violations: COBOL, Java

COBOL rule summaries:
- Paragraph naming convention - prefix control
- Section naming convention - prefix control
- Program naming convention - prefix control
- When using binary data items (COMP), use the SYNCHRONIZED clause
- Never truncate data in MOVE statements
- Avoid using Sections in the PROCEDURE DIVISION (use Paragraphs only)
- IF statements must be closed by END-IF
- Sections and paragraphs should be located after the first statement calling them
- Avoid using the GOTO statement
- Avoid large programs with too many Sections

Java rule summaries:
- Avoid public/protected setter for the generated identifier field
- Avoid unsecured EJB remote method
- Avoid static field of type collection
- All page files should be in a specific directory
- Use only Struts HTTP Servlet
- All script files should be in a specific directory
- All stylesheet files should be in a specific directory
- Persistent classes should implement hashCode() and equals()
- Persistent class equals() and hashCode() must access fields through getter methods
- Provide a private default constructor for utility classes
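
To make a few of the Java rules above concrete, here is a minimal sketch of a persistent-style class that follows them: no public setter for the generated identifier, and equals()/hashCode() implemented through getters. The Customer class, its fields and the use of email as the business key are hypothetical, and persistence annotations are omitted.

```java
import java.util.Objects;

// Hypothetical persistent-style entity (annotations omitted).
public class Customer {

    private Long id;          // generated identifier: no public/protected setter
    private String email;     // business key used for equality

    protected Customer() {
        // no-arg constructor kept for the persistence framework
    }

    public Customer(String email) {
        this.email = email;
    }

    public Long getId() {
        return id;
    }

    public String getEmail() {
        return email;
    }

    @Override
    public boolean equals(Object other) {
        if (this == other) {
            return true;
        }
        if (!(other instanceof Customer)) {
            return false;
        }
        Customer that = (Customer) other;
        // compare through getters rather than touching fields directly
        return Objects.equals(getEmail(), that.getEmail());
    }

    @Override
    public int hashCode() {
        return Objects.hash(getEmail());
    }
}
```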

11 Deployment Approach

BU CAST Program Owners. Responsibilities:
1. Work with the account teams to identify and prioritize CAST demand
2. Submit the BU demand plan to the AIC and IDP
3. Funnel application code demand into the AIC

AIC CAST Shared Services. Responsibilities: runs the code, answers questions, provides results.

Client Engagement Team. Responsibilities:
1. Trained on the CAST dashboards/reports
2. Receives the action plan
3. Iterates on the action plan with the SMEs to further refine and prioritize it
4. Deploys the action plan
5. Accountable for ensuring that expected/planned results are achieved
6. Works with the BU CAST Program Owners to prioritize CAST requirements for their accounts

CAST Application SMEs. Responsibilities:
1. Interpret the results
2. Prepare the action plan recommendations
3. Train the account team key users on the CAST dashboards/reports

All parties are supported and advised by the CAST project manager and technical team.
[Diagram flow: application code demand goes from the BU Program Owners to the AIC; CAST outputs go to the Application SMEs, whose action plan is handed to the Client Engagement Team]

12 Shared Services Deployment was a Critical Success Factor
A Shared Service is a dedicated team that offers standardized services to projects for a specific lifecycle activity.
- A specific service catalog is defined for each Shared Service
- Support is delivered by a no-shore team
- The request workflow is implemented in a ticketing tool, with the same template for each service
[Diagram: Projects 1-3 submit requests to the Shared Service through the service catalog and request workflow]

13 What has been achieved?
- 5,000 FTE covered
- Built into the standard CMMI Quality Audit approach for CSD in 2013
- >400 applications analysed
- CAST is now included in all large-scale AM bids
- >65 million lines of code assessed
- Shared Services AICs established in Morocco and India

14 Case Studies
- Query time improvement
- Ticket volume reduction
- On-line performance improvement

15 Productivity

16 Productivity Principles
- It must be simple to understand
- The cost of capturing the data needed to calculate it should be minimal
- It should not be open to multiple interpretations
- The calculation should be automated
- Ideally, it should be based on an external industry benchmark

17 AM: Impact of Quality on Productivity
Based on a snapshot of MI values and ticket volumes across a selection of client applications (multiple technologies).

18 The Key Question is: what is the Measure of Output in Applications Development?
[Diagram: Manual approach, original estimate at pre-sales and during execution; Automated approach, at close/completion using actual client drivers; estimate variance vs. Automated Function Points]

19 The Key Question is: what is the Measure of Output in Applications Maintenance?
Flow metrics vs. lines of code.

20 What could be a Productivity calculation?
Productivity (Applications Development) = (Quality Index) x (Automated Function Points / Cost)
Productivity (Applications Management) = (Quality Variance between successive releases) x (Lines of Code Variance / Monthly Cost)
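
As a rough illustration of how such a calculation could be automated, here is a minimal Java sketch of the two candidate formulas. The class and method names, the scale chosen for the quality index and the sample figures are assumptions for the example, not values from the slides.

```java
// Minimal sketch of the two candidate productivity formulas above.
// Units and sample figures are illustrative assumptions.
public final class ProductivityCalc {

    private ProductivityCalc() {
        // utility class: private default constructor (see the Java rules above)
    }

    // Applications Development: (Quality Index) x (Automated Function Points / Cost)
    static double developmentProductivity(double qualityIndex,
                                           double automatedFunctionPoints,
                                           double cost) {
        return qualityIndex * (automatedFunctionPoints / cost);
    }

    // Applications Management: (Quality Variance between successive releases)
    // x (Lines of Code Variance / Monthly Cost)
    static double managementProductivity(double qualityVariance,
                                          double linesOfCodeVariance,
                                          double monthlyCost) {
        return qualityVariance * (linesOfCodeVariance / monthlyCost);
    }

    public static void main(String[] args) {
        // e.g. 450 automated function points delivered at a cost of 300 units,
        // with a quality index of 0.85
        double dev = developmentProductivity(0.85, 450, 300);

        // e.g. quality improved by a factor of 1.05 between releases, with
        // 12,000 lines of code changed for a monthly cost of 200 units
        double am = managementProductivity(1.05, 12_000, 200);

        System.out.printf("Development productivity: %.3f%n", dev);
        System.out.printf("Management productivity:  %.3f%n", am);
    }
}
```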

21 Productivity Challenges
- Automated Function Point calibration
- Lines of code reliability
- Cost evaluation: hours? financial? non-people? offshore impact?