Independent Quality Assurance
Business Transformation Programme
Inland Revenue
Final version 2.0, December 2016
kpmg.com/nz

1 Executive summary

This report presents the findings from the KPMG Independent Quality Assurance Review 5 (IQA5) and Technical Quality Assurance Review 4 (TQA4) of the Inland Revenue Department Business Transformation Programme (BT Programme). The objective of this review is to provide independent assurance to the Programme Sponsor, taking into account the perspectives of relevant stakeholders including the Programme Director, Transformation; the Manager, Internal Assurance and Advice; and external monitoring agencies.

1.1 Background to our review

Inland Revenue (IR) has been working to meet the Government's expectations of a public sector that delivers smarter, modern services for less and, where applicable, is aligned to the Government's Strategy and Action Plan, and to meet IR's own strategic ambition of being a Government organisation that delivers world-class services to its customers and stakeholders. During 2011 IR embarked on a journey to reduce the risk of its ageing revenue system by making the case for substantial re-investment in the organisation and its ICT systems: the Business Transformation Programme (BT). During 2012 and 2013 the Programme Business Case was developed; it identified a ten-year period of transformation for the IR operating environment, the way IR engages with the New Zealand public, and the supporting ICT infrastructure and applications. KPMG provided independent quality assurance on the Programme Business Case in February 2013, and again in August of that year.

Following the presentation of the Programme Business Case to Treasury, the Minister and Cabinet, the Transformation Programme entered the Mobilisation phase in December 2013. The Programme Business Case was approved by Ministers in the first quarter of 2014, and a Cabinet paper seeking drawdown of funding for the Programme through to the beginning of January 2015 (the Pre-Design phase) was approved in April 2014. During the Mobilisation and Pre-Design Preparation phases of the Programme, Accenture were selected as the Programme partner for the High Level and Detailed Design phases. During the High Level Design phase a procurement process was undertaken to select a Commercial off the Shelf (COTS) solution to provide core tax processing capability. FAST Enterprises were selected as the vendor and have been working with IR since August 2015 to develop the detailed design for Stage 1 (Goods and Services Tax), together with planning activity for subsequent stages.

The Programme has now largely completed the software build and configuration, and is in the testing phase prior to a go-live planned for early 2017. Management are targeting a go-live in February 2017 (the "green date"), though we note that the publicly declared date is April 2017 (the "red date").

To support the delivery of the Programme, IR has engaged KPMG to provide Independent Quality Assurance (IQA) and Technical Quality Assurance (TQA) across the Programme. This review follows prior KPMG reviews in March 2014 (IQA1), June 2014 (addendum to IQA1), January 2015 (IQA2, TQA1), June 2015 (IQA3, TQA2), March 2016 (IQA4, TQA3) and August 2016 (IQA 4.2). The review also recognises that a number of other independent assurance reviews have been planned as part of the Programme quality assurance plan; this review has leveraged the scope of other recent reviews as appropriate and as agreed with the Programme.

In this report we have also made reference, as appropriate, to some prior recommendations, which are cited in the format "IQA 4/5"; this format was adopted in our prior reports and should be read as recommendation 5 from IQA4.

Our detailed approach and scope are set out in Appendices 1 and 2 (sections 5.1 and 5.2). We have also assessed the Programme's progress on recommendations made under those prior reviews.

1.2 Summary terms of reference

The terms of reference for this review were confirmed in the Statement of Work MIT-00000060-AGR-004 agreed between IR and KPMG in October 2016. Our terms of reference address both Independent Quality Assurance (IQA) and Technical Quality Assurance (TQA), to provide stakeholders with an assessment of the status of the Programme addressing governance, alignment to other programmes and organisations, and technical delivery.

Our review was undertaken in November 2016, as the Programme transitioned from Business Systems testing into Scaled Business Simulation, and some three months prior to the planned go-live date for the Stage 1 implementation. We have reviewed documents and artefacts that were current at that time; consequently, a number of these artefacts were in draft or work in progress (our agreement with the Programme is that we will not review draft artefacts in detail) and may have been amended since.

1.3 Overview of our assessment

We have set out below the key findings from our report. The comprehensive findings, and our recommendations, are set out in sections 2 to 4 below.

1.3.1 Programme Readiness

The Programme, at present, is forecasting that it is on track for its green go-live, in accordance with the published timescales. However, we note that there is some ambiguity and flexibility in these dates, as the Programme acknowledges red (public) and green (aspirational) dates for the go-live. Our assessment is that the Programme is presently some way behind schedule (i.e. progress through the Implementation stage has, in general, been slower than planned). In our view, there is significant risk to the achievement of the planned go-live date (the green date), and some residual risk to the red date remains present. The risk arises primarily in respect of the testing activity: the conclusion of Business System Testing (BST) has slipped significantly, preparation for non-functional testing is behind expectations, and deployment activities have not been specified to the degree that we would expect at this stage. That said, in other respects (for example, integration of Heritage systems, and execution of data migration) the Programme has, through the Implementation phase thus far, successfully mitigated many of the more significant risks that were apparent in our prior reviews.

With respect to the planned activity between now and the go-live date, we believe that the Business Readiness Assessment (BRA) framework, and the process of conducting rehearsals, represent good practice and provide a comprehensive framework to track readiness as it increases over time. However, the complexity and scale of the solution mean that the go-live decision will require a carefully considered judgement call, balancing the needs of the Programme to maintain progress with the demands of the stakeholder community. We note also that some decisions related to go-live, such as commencing detailed staff training and customer communications, have to be made relatively soon (i.e. early to mid December) if the green date is to be achieved.

The Programme, together with the governance bodies, will need to track the results from the upcoming BRAs (and the associated risk profile) diligently; they also need to guard against group-think (noting that different stakeholders will represent different views about different aspects of the BRA results), and be very aware of any changes to the risk profile and how these are traded off against the go-live date. This is particularly important because the benefits, in terms of solution quality, of a delay from the green date (6 February) to later in the go-live window (such as the red date) have to be set off against the need to maintain progress on the subsequent stages.

1.3.2 Organisation Readiness

With respect to the business readiness activities, we noted continued strong evidence of effective communications and change management from the Programme, together with commitment from the business to the Programme objectives. The communications and training activities have been well planned and are, as far as is possible, resilient, given that they cannot yet commit to the final go-live date and the logistics for end-user training have to accommodate the Christmas holiday. We also noted evidence of healthy medium to long term planning and focus, in addition to the Stage 1 go-live.

1.3.3 Programme Governance

In our assessment, the governance of the Programme continues, in general, to be effective, and we received consistent feedback that the governance process (particularly the Programme Leadership Team (PLT) and Programme Governance Board (PGB) forums) has become more effective, with a reduced documentation and process workload. This has allowed an improved, and necessary, focus on open communication and timely decision-making.

Since our prior full review (IQA4 in February 2016), the Programme senior leadership has changed, following the need to replace the incumbent Programme Director, who left the Programme in September. At an operational level the transition appears to have been effective, with the Deputy Commissioner, Change, stepping into the Programme Director role in an interim capacity. Subsequently, during the course of our fieldwork, it was announced that the Deputy Commissioner, Change, would formally take accountability and responsibility as the Programme Director, with the Commissioner assuming overall accountability as the Senior Responsible Owner (SRO). This is, in our opinion, appropriate and provides for continuity in the Programme Director role. We concur that to do anything else in the short term, such as recruiting an external Programme Director, would be impractical at this point and would certainly increase the risk to Stage 1 delivery. We are, though, of the view that as the Programme scales up to deliver Stages 2 and 3, additional senior-level programme delivery resource will be required. This is due in part to the increasing size and complexity of the Programme, but also to ensure that the risk profile is acceptable in terms of the single point of failure that currently resides with the Programme Director.

1.3.4 Project management processes & controls

We reported, in prior phases, that the programme management practices in the Programme had been under increasing stress; this had been acknowledged by the Programme, which had planned remedial actions. Similarly, the capacity and capability in the Programme Management Office (PMO) were acknowledged as requiring attention. It is our assessment that, through the Implementation phase, the Programme has not achieved the intended and necessary uplift in these areas, and consequently some aspects of the programme management processes have not been conducted to the level that we would expect for a Programme of this scale, complexity and importance. In particular, the capacity and capability of the PMO to provide the necessary management support to the Programme, and the associated supporting infrastructure and toolset, have not been adequate. Furthermore, the scheduling, reporting and dependency management capability has been variable across the Programme workstreams, and has not been sufficiently integrated at the Programme level.
As a result, the Programme does not have the degree of quality and precision in its schedule, progress reporting and forecasting that we would typically expect to see in a Programme of this scale. The Programme has been aware of these issues, to a large extent, and some remediation has been possible. Specifically, a scheduling overlay has been applied for the Stage 1 go-live activity to provide an integrated view of the workstream schedules, and, with regard to the PMO, Accenture have recently been engaged to supplement and develop the available tools, capability and support, although this work is not fully defined at this point. We note, though, that the benefits of this work will only begin to be felt in Stage 2.

1.3.5 Testing

We previously reported (IQA4) that the primary Programme reference for testing (the Testing Strategy and Plan) provided a sound baseline for the subsequent test activities, though with some degree of complexity apparent to accommodate different suppliers' preferences. It is apparent, however, that in the execution of the testing to date, the processes that were defined have not been followed with the rigour that we would expect to see (e.g. the disciplines and controls related to test entry/exit criteria have not been consistently applied), the testing documentation is not fully coherent, and there are other material variances (e.g. the approach to User Acceptance Testing). It is apparent, also, that some of the roles and accountabilities (e.g. the UAT test manager and the BT Programme test manager) have not been operationalised as defined.

With respect to test execution, progress on BST has been significantly slower than intended, and it is apparent that the original plan under-estimated the BST timescale, particularly with respect to the inter-dependencies that were to be managed and the inherent complexity of the IR systems environment. The risk to the planned timescale was recognised in May, but not adequately mitigated, although the Programme has sought to reduce the impact of these issues by supplementing headcount in areas that were under pressure, using both IR staff and external suppliers.

1.3.6 Methodology

In our prior reviews we have made a number of observations and recommendations with respect to the Programme methodology, and specifically the ability to take advantage of FAST's preferred iterative methodology (and associated approach) and to integrate it effectively with the BTM methodology and with other suppliers. While in prior phases these largely reflected risks to the Programme, a number of issues (and adverse consequences) have arisen as a result of this underlying difference since our IQA4 review. These manifest in a number of areas, including testing and project management processes and controls (see above), and there is an understanding that the roles and accountabilities for leading activities, as they have been defined thus far, have not always best suited the strengths of the participants, given the scale and complexity of this Programme. We understand that these lessons are being considered in the planning for Stage 2, which is encouraging, but in our assessment there remains a significant residual risk in Stage 1, for example with respect to the quality of planning, accountability and timescales for SBS testing, non-functional testing, deployment and cutover.

1.3.7 Stage 2

This review has been focussed on Stage 1 readiness, but we have noted that the planning for Stage 2 deployment has been initiated in parallel with Stage 1. This is encouraging, as is the feedback that this planning is leveraging the lessons learned from Stage 1.