Session 15 PD: Model Governance
Moderator: Jason A. Morton, FSA, MAAA
Presenters: David R. Beasley, FSA, CERA, MAAA; Jason A. Morton, FSA, MAAA; Robert P. Stone, FSA, MAAA
Agenda
- Scope of Model Governance: Elements and Trends (Jason Morton, FSA, Deloitte Consulting LLP)
- Components of Governance: Get it Modernized (Robert Stone, FSA, Milliman, Inc.)
- Change Management: Keep it Governed (David Beasley, FSA, Oliver Wyman)
- Q&A
Actuarial models sit at the heart of the actuarial function and, as a result, are at the core of any transformation. Model design and operation have a significant influence on the effectiveness of the actuarial function and on the ultimate success of the organization it serves. Insurers need to consider carefully the governance of models and its impact on model cost, model risk, and the ability to make informed decisions on a timely basis.
Copyright 2015 Deloitte Development LLC. All rights reserved.
SOA Survey: Actuarial Modeling Controls
- An update to the 2012 research survey is currently underway; PBR readiness is the driver
- Includes the same questions as the 2012 survey, looking to measure improvement
- Adds a few more questions given the tools, techniques, and practices that have evolved
- Timeline: results due to be published in August, presented at the Valuation Actuary Symposium (and others)
Span of Model Governance: Typical Target-State Architecture
- Data load and validation layer: data extraction and integration from admin systems (Admin 1, Admin 2, ...) and other source systems; data transformation, validation, and monitoring; a landing zone serves as the common point of entry for other inputs
- Central data store: single point of truth, where all information is stored; includes the finance data warehouse and the assumption warehouse
- Assumption and other inputs: assumptions, product features, scenarios, sensitivities, etc.
- Actuarial models / calc engine: consolidation is a key trend; data extract, validation, and transformation feed the models
- Data aggregation layer feeding the reporting layer: standardized reports, trends, analytics, and dashboards, with downstream consumers such as the ledger, planning, investments, and reinsurance
- Ownership is split between Production and Development teams
Elements of Leading Practices in Model Governance
- Oversight and support from the Board of Directors and Senior Management
- Audited and timely: audits and model validations examine inputs, product features, methods, calculation accuracy, usage, etc.; watch for ELMO, especially the L (limitations and simplifications)
- Infrastructure (e.g., grid/cloud computing environments) is key to governance while facilitating model efficiency and runtime
- Policies and procedures, controlled: change management is absolutely crucial; look across the entire lifecycle of a model; best practice is to embed and enforce
- How to get there? Modernization, but also changes in org structure and operating models
- Model design is just as important: a poorly designed model is more difficult to govern
Other Trends in Governance
- Org structure: CDT / COE structure; Center(s) of Excellence; a model management committee; Model Development Teams vs. Production Process Teams (production vs. maintenance); Internal Audit hiring more actuaries; Model Risk Management; line-of-business roles (Business Manager, LOB Resources) supplying business requirements
- Actuarial working more closely with Data/IT: computing environments; locking down production with formal change management; a System Administrator role; use of enterprise tools, especially for data handling/ETL and process scripting
- Automation: for valuation in particular, to the point of being largely touchless; a release schedule for regular updates to the corporate model(s) throughout the year
MODELING ENVIRONMENT
The Modeling Environment
A critical mindset for production modeling is to address modeling holistically as a Modeling Environment. The Modeling Environment includes:
- Governance: the governance of all production models during the development, acceptance, and model execution processes
- Validation: the validation of changes and validation of results
- Change Management: the process for introducing changes into the production model
- Business Uses & Applications: the business uses and applications that the model supports
- Organization: the organizational environment: the end users, the model development team, the model production team, and IT
- Model Components: the components of the model: code, data, reports, workflows, configurations, etc.
- Technology Platform: the technology environment for scalability, disaster recovery, archival, etc.
Scope of the Modeling Environment
- Model Development (model change and improvement): Define Requirements -> Implement -> Test
- Model Execution (production of information): Refresh -> Execute -> Validate
Development Environment
Model Development stages: Define Release Content -> Build and Test -> Configure -> Integration Testing -> Acceptance Testing
- Define Release Content: end users submit requirements; the Modeling Team provides a high-level estimate of effort and defines and documents the specifications
- Build and Test: actuarial and data developers build and unit test; testers provide documentation of test results
- Configure: the Modeling Team configures the model for the new features
- Integration Testing: the Modeling Team tests the updated model, data, and assumptions, and documents the results of the testing
- Acceptance Testing: the Modeling Team executes and documents the waterfall analysis for users to accept changes
Governance gates: the Change Board agrees the release; the Design Authority approves the specifications, the unit test results, and the integration and acceptance test results; the Steering Group ratifies the release and approves promotion to production
Execution Environment
Model Execution: Refresh -> Execute -> Validate
- Off-cycle analysis: out-of-cycle testing of parameters; assumption updates (the experience studies team defines, tests, and documents new assumptions for use in cyclical execution; the Assumptions Committee approves updates); ad hoc analysis
- Automated cyclical execution: monitoring of the process, including data delivery, infrastructure availability, errors, and warnings; top-line review to confirm results are ready for delivery to end users; Line 1 validation reports
- Validate: detailed actuarial review by the end user; acceptance of results; requests for additional analysis to support validation
- During the production process, decisions will be made regarding issues: an emergency patch procedure for critical incidents; manual adjustments for non-critical items; the escalation and approval processes will need to be defined
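The monitoring and top-line review step in the cyclical execution above can be sketched as a simple automated gate. This is a minimal illustration, not part of the deck: the metric names, the 5% tolerance, and the function name are assumptions.

```python
# Hypothetical "Line 1" validation gate for an automated cyclical run.
# Metric names and the 5% tolerance are illustrative assumptions.

def line1_validation(run_metrics, prior_metrics, tolerance=0.05):
    """Compare top-line run metrics to the prior cycle; flag items
    whose relative movement exceeds tolerance for actuarial review."""
    flags = []
    for metric, value in run_metrics.items():
        prior = prior_metrics.get(metric)
        if prior in (None, 0):
            flags.append((metric, "no prior-cycle baseline"))
            continue
        move = abs(value - prior) / abs(prior)
        if move > tolerance:
            flags.append((metric, f"moved {move:.1%} vs prior cycle"))
    return flags  # empty list => ready for delivery to end users

current = {"statutory_reserve": 1_080.0, "account_value": 9_800.0}
prior = {"statutory_reserve": 1_000.0, "account_value": 9_750.0}
print(line1_validation(current, prior))
```

In practice a non-empty flag list would route into the escalation and approval process the slide describes, rather than simply printing.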
GOVERNANCE CONTROL
Model Governance Committees and Oversight
- Model Steering Committee: final authority over the production model; assures resources are in place to deliver on business needs
- Change Board: responsible for governing changes to the production model, including prioritization, timing, and release scope
- System Design Authority: ensures features are designed and developed with appropriate quality and rigor, considering all aspects of implementation and the business user requirements
Model Controls
The where and how of model updates and model use:
- Regulates how model change takes place
- Regulates and protects the environment in which the corporate model exists
- Helps ensure unwanted changes don't make it into the main corporate model
- Defines which users can make which types of changes in the model
- Incorporates the processes followed in making model changes
- Includes understanding what end users are allowed and expected to do
- Controls may look very different depending on the software being used
Model Governance and Change Controls
Governance and control processes: smooth model updates require repeatable processes. The goal should be to update the model the same way every time (as long as nothing structural has changed in the model):
- Automation: an untouched production environment
- Same formats for seriatim assets and liabilities
- Same code/process to create inforce files
- Same sources for updating assumptions
- Same information for validating updated runs
- Control changes to all parts of the process
- Measured approach: follow a schedule; identify impacts on model results before release. Validation!
- Slightly more freedom for product development models, but an agreed-upon process is needed for integrating with the corporate model
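"Use the same code/process to create inforce files" is usually realized as a single, version-controlled extract routine with a fixed column layout, so every model update consumes an identically shaped file. The sketch below illustrates the idea; the column names are assumptions, not any specific vendor format.

```python
# Illustrative single-source inforce writer: one fixed layout,
# one routine, and any record missing a required field fails loudly.
# Column names are hypothetical, not a specific admin-system format.

import csv

INFORCE_COLUMNS = ["policy_id", "plan_code", "issue_date", "face_amount"]

def write_inforce(records, path):
    """Write seriatim inforce records in the standard layout,
    rejecting any record missing a required field."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=INFORCE_COLUMNS)
        writer.writeheader()
        for rec in records:
            missing = [c for c in INFORCE_COLUMNS if c not in rec]
            if missing:
                raise ValueError(
                    f"record {rec.get('policy_id')!r}: missing {missing}")
            writer.writerow({c: rec[c] for c in INFORCE_COLUMNS})
```

Failing fast on a malformed record, rather than writing a partial file, is what keeps the downstream model update repeatable.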
Model Governance and Change Controls
- Separation of model development from model execution is a must: no tinkering. This applies regardless of a centralized or decentralized structure.
- Specialize around operations vs. analytics: those who are part of the model development team, or who work with or report to the model steward, are operations; the end users are analytics.
- Operations can be decentralized, but companies typically evolve toward centralizing operations once how a model is developed, maintained, and executed becomes a shared process.
- Externalizing data to tables is fundamental and critical; organizing table data then becomes the focus.
Model Governance and Change Controls
Centralize/delegate responsibilities when possible:
- One source for inforce files and populations
- A model steward process for controlling the database
- A database/model release schedule
- One source for scenario sets and/or assets
- Use of network libraries for the items above plus table files
- Monthly coordination meetings with user base representatives
MODEL INDUSTRIALIZATION
Components of Industrialization
Governance / model risk:
- Change mindset from delivering projects to a well-defined process
- Separation of model change and model execution
- Very rigorous change process around all aspects of model change
- Automation of manual tasks
- Mindset change: the models are right; test to destruction
- Deliver results with clear attribution analysis
Six Steps to Industrial Modeling
- Step 1: rationalize, harmonize, and synchronize core actuarial models
- Step 2: address calculation capacity and support (computing: grid/cloud?)
- Step 3: control and collaborate on business logic and models (model development solution)
- Step 4: control data input and output (data management solution)
- Step 5: control assumptions and orchestrate periodic processes (automated model projections)
- Step 6: organize the actuarial team to align with the new processes and technology (target operating model)
Steps 1 through 3 are the "prepare for production" steps.
Preparing for Production: Consistent, Current, Orchestrated, Understood
- Rationalize: consolidate model files as much as reasonable
- Synchronize: align with the latest model version; leverage product and regulatory updates
- Organize: orchestrate the end-to-end projection/job-step workflow; define consistent run schedules across applications
- Simplify: review and document customizations, re-implementing where needed; simplify logic and approach to reflect current usage; remove orphaned logic
CHANGE MANAGEMENT
Session 15: Model Governance, May 16, 2016
David Beasley, FSA, MAAA, CERA
2016 Life & Annuity Symposium, Nashville, TN
Oliver Wyman
The Model Owner executes the change management process, ensuring strong model governance. Model Owner duties:
- Production runs and results
- Work with the Business Manager to plan model changes
- Work with LOB Resources to design changes
- Request enhancements from the Model Development Team
- Review enhancements
- Perform testing
- Integrate model changes into the production environment
- Drive the model change approval process
The Model Owner should be responsible for production models: empower this individual to execute model changes, and require this individual to adhere to model governance standards.
Monitoring results
Actuarial analytics that are efficient, repeatable, and tailored to the product and modeling application:
- Margin analysis: earned rate, credited rate, COI rate, mortality rate, etc.
- Key metrics: reserve / AV, per-policy expense, implied surrender rate
- Dashboards customized to management's needs
- Awareness of trends: informing management of trends in inforce experience and the economic environment
- Understanding financials and attributing performance: linking earnings, balance sheet, and value changes to inforce experience and the economic environment, i.e., crafting the story
- Planning: providing a view on future financials and potential drivers of performance
- What-if analyses: informing management of the impact of certain actions (e.g., rate setting, buy-back programs, sale levels/mix, M&A)
- War games: interacting with management by crafting scenarios and potential responses (e.g., a "financial crisis replay")
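A minimal sketch of the repeatable key-metric calculation the slide calls for. The formulas are simplified illustrations (e.g., the implied surrender rate here is just surrendered account value over starting account value); real definitions vary by product and company.

```python
# Hypothetical key-metric helper; formulas are simplified illustrations,
# not a standard actuarial definition of any of these ratios.

def key_metrics(reserve, account_value, expenses, policy_count,
                surrendered_av, starting_av):
    """Return the three key metrics named on the slide as a dict."""
    return {
        "reserve_to_av": reserve / account_value,
        "per_policy_expense": expenses / policy_count,
        "implied_surrender_rate": surrendered_av / starting_av,
    }

m = key_metrics(reserve=95.0, account_value=100.0, expenses=50_000.0,
                policy_count=1_000, surrendered_av=6.0, starting_av=120.0)
```

Computing the same ratios every cycle, from the same inputs, is what makes trend dashboards comparable period over period.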
Growing demands and competing challenges
Regulatory and reporting changes:
- Principle-based approaches for statutory reserves and capital; ORSA; SIFI requirements; required reporting for captives
- FASB/IASB insurance accounting updates; European and Canadian solvency requirements applying to US subsidiaries
Increasing stakeholder needs:
- Internal and external stakeholders with zero tolerance for mistakes
- Rating agencies requesting additional disclosures and evidence of strong governance
- Regulators and rating agencies asking for alignment of risk management with reporting, and a management use test
Increasing management demands:
- Improving planning and forecasting; developing richer real-time information while maintaining accuracy; changing internal management information metrics; cost-saving and efficiency focus
Stressed resources and personnel:
- Staff is spending too much time producing results and not enough on analysis and communication
- Result production is labor intensive, and little time is spent on documentation and process improvement
Continuous innovation is needed to meet emerging demands and challenges.
What could go wrong? Examples of risk sources and potential mitigation steps:
1. Model design errors: the model change does not comply with or meet business requirements. Mitigation: clear requirements; gap analysis/design review; acceptance testing.
2. Input errors: source data, assumptions, or parameters input incorrectly; data misunderstood by the model. Mitigation: review, validation, and testing.
3. Incorrect specifications: model specifications incorrect or misunderstood. Mitigation: clear specifications; review process.
4. Incorrect model coding: model coding incorrect due to misunderstanding of specifications or requirements. Mitigation: review, validation, and testing.
5. Data errors: errors in source data or assumptions. Mitigation: validation of data.
6. Misunderstanding of model output: misunderstanding or misinterpretation of model output. Mitigation: clear requirements and documentation.
7. Inappropriate approximations: approximations create invalid or less useful output. Mitigation: validation and testing of results.
8. Controls: controls not well designed or not well executed. Mitigation: clear documentation; peer review of the modeling process and controls.
Annual planning and prioritization
Prioritize the model issues and analytics/reporting shortcomings to remediate, and plan the remediation effort: errors; control issues; estimated impact on key metric; expected assumption changes or model releases; remediation work plan; desired go-live dates.
Example issue log (title; description; type; estimated impact on key metric):
1. Poor static validation of statutory reserves (110%): initial modeled statutory reserves are 110% of the actual statutory reserves. Control issue. Estimated impact: none.
2. 2014 product mapped to 2011 product: the new product released in 2014 is being mapped to the previous-generation 2011 product; face amount inforce as of EOY 2014 was $100M. Simplification. Estimated impact: $0.5.
3. Partial withdrawals are not modeled: assumes that no policyholders take partial withdrawals; total PWs for 2014 were $25M. Simplification. Estimated impact: $0.5.
Research items and detailed project planning
Cost/benefit analysis:
- Document the goal of the model change, i.e., the benefit
- Identify the Development Team, Modeling Team, and LOB Resources needed
- Document the proposed testing and analysis approach
Deliver a detailed description of the issues to be addressed and the specific modeling changes needed to address each (model issue; proposed change; effort to change; expected impact on validation; expected impact on key metric):
1. Poor static validation of statutory reserves (110%): update interest rate table. Modeling Team: 8 hours. Impact on validation: +3%. Impact on key metric: none.
2. Fix guaranteed charges on Product X. Modeling Team: 16 hours. Impact on validation: -8%. Impact on key metric: +$5.0.
3. Improve AG38 coding. Development Team: 80 hours; Modeling Team: 40 hours. Impact on validation: -3%. Impact on key metric: none.
Confirm the decision to proceed with the Business Manager; alert the centralized change management team of the revised plan.
While developing changes, what types of analytics and testing can we do?
1. Static validation: confirm model coverage and compression
2. Dynamic validation: identify disconnects between actual results and model projections
3. Control totals: check totals for key values and model logic flows at all hand-off points in the process
4. Key ratios and checks: e.g., reserves per unit, statutory-to-GAAP reserves, claims-to-premium
5. Rollforwards: steps explaining the projected or actual change in balances (e.g., account value, DAC), with the goal of confirming the reasonableness of each step
6. Attribution analysis: analysis to explain complex movements in assets and liabilities
7. Sources of earnings: identify drivers of profits/losses
8. Regression testing: confirm model changes do not have unintended impacts
9. Parallel testing: testing the calculations through use of an independent model
10. Extreme value testing: check that the model performs as intended when invalid data or extreme (boundary) data values are used
11. Sensitivity testing: custom sensitivities to gauge the reasonableness of the model and assist with understanding and forecasting results
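Technique 3 above (control totals at hand-off points) can be sketched as follows: each stage of the process re-computes the same totals, and a hand-off fails loudly if the totals drift. This is an illustrative sketch; the field name and penny-level tolerance are assumptions.

```python
# Illustrative control-total check at a process hand-off point.
# Field name and tolerance are assumptions for the sketch.

def control_totals(policies):
    """Totals that every stage of the process can re-compute cheaply."""
    return {
        "count": len(policies),
        "face_amount": round(sum(p["face_amount"] for p in policies), 2),
    }

def check_handoff(sent, received, tolerance=0.01):
    """Raise if the receiving stage's totals differ from the sender's."""
    a, b = control_totals(sent), control_totals(received)
    if a["count"] != b["count"]:
        raise ValueError(f"count mismatch: {a['count']} vs {b['count']}")
    if abs(a["face_amount"] - b["face_amount"]) > tolerance:
        raise ValueError("face amount control total broken at hand-off")
```

The value of the technique is that a dropped policy or a truncated file is caught at the hand-off where it happened, not three stages later in a reserve number.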
Static validation / dynamic validation
Static validation:
- Checks that reconcile policy, contract, and asset metrics from the model to the data extract: counts, face amounts, account values, death benefit, cash surrender values, policy loan balance, statutory and tax reserve
- A first-level check identifies missed policies or whole blocks of plans; a second-level check identifies data adjustments or approximations made
Dynamic validation:
- Compare multiple projection years (usually fewer than 5) of projected results to historical reported values
- Performing dynamic validation of cash flows, balance sheet, and income statement will reveal potential issues, but a drill-down is often needed to understand the cause (e.g., assumptions)
- Backcasting over the historical period provides additional information where possible
- Getting the necessary information out of model projections is the easier part; often the historical information is not readily available at the right level
- Dynamic validations can be automated as part of the reporting process, but more granular validations are usually also performed as part of a formal validation exercise
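The two levels of static validation described above can be sketched as a reconciliation by plan code: the first-level check finds whole plans missing from the model, and the second-level check surfaces amount differences from data adjustments or approximations. Field names and the tolerance are illustrative assumptions.

```python
# Illustrative static-validation sketch: reconcile model totals back to
# the admin data extract by plan code. Field names are assumptions.

from collections import defaultdict

def totals_by_plan(records, field):
    out = defaultdict(float)
    for r in records:
        out[r["plan_code"]] += r[field]
    return dict(out)

def static_validation(extract, model, field="account_value"):
    """Return (plans missing from the model, per-plan amount differences)."""
    src = totals_by_plan(extract, field)
    mdl = totals_by_plan(model, field)
    missing_plans = sorted(set(src) - set(mdl))        # first-level check
    diffs = {p: mdl[p] - src[p] for p in src           # second-level check
             if p in mdl and abs(mdl[p] - src[p]) > 0.005}
    return missing_plans, diffs
```

Running the same reconciliation for each metric on the slide (counts, face amounts, reserves, ...) gives the full static validation grid.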
Regression testing / attribution analysis / sensitivity testing
Regression testing:
- Model validation practice of testing the new model version against the previous version to ensure unintended outcomes were not introduced by the model changes
- A first-level comparison of model cells provides a high-level regression test; a second-level comparison should include a robust test bed that captures all model point permutations
Attribution analysis:
- Attribute changes between models (e.g., version updates) or between model runs (e.g., valuation dates)
- Generally, complex models have multiple components to attribute; the analysis should be able to quantify the impact of individual components and the interaction of components
Sensitivity testing:
- Evaluation of the sensitivity of model results to changes in the model inputs (e.g., assumptions, parameters, data)
- May be used as a benchmark for attribution analysis
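One common way to implement the attribution analysis described above is a step-wise waterfall: re-run the model swapping in one change at a time and attribute the total movement to each step. The sketch below assumes the model is any callable from a settings dict to a single metric; the toy model and step names are illustrative.

```python
# Step-wise attribution sketch: impacts sum, by construction, to the
# total movement. `model`, the toy factors, and step names are all
# illustrative assumptions, not a specific production model.

def attribution(model, base_settings, steps):
    """steps: ordered list of (name, overrides). Returns per-step
    (name, impact) pairs that sum to the total movement."""
    settings = dict(base_settings)
    prev = model(settings)
    impacts = []
    for name, overrides in steps:
        settings.update(overrides)
        curr = model(settings)
        impacts.append((name, curr - prev))
        prev = curr
    return impacts

# Toy model: reserve = 100 * lapse_factor * rate_factor
toy = lambda s: 100.0 * s["lapse_factor"] * s["rate_factor"]
impacts = attribution(toy, {"lapse_factor": 1.0, "rate_factor": 1.0},
                      [("lapse update", {"lapse_factor": 1.10}),
                       ("rate update", {"rate_factor": 0.95})])
```

Note that step-wise impacts depend on the order of the steps; that ordering effect is exactly the "interaction of components" the slide says the analysis must be able to quantify.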
Documentation
How can documentation be a tool to manage model risk?
Change documentation controls: business need addressed; model changes; inforce data; assumptions with support; product features; methodology, including simplifications; testing performed and sign-off; peer review; Business Manager approval.
The change management team relies on the Model Owner's documentation to answer the following:
- Was the business need addressed? Are there residual issues, and are they being tracked?
- Was appropriate, agreed-upon testing performed? Did it demonstrate the desired results?
- Was an appropriate peer review performed?
- Is the model still fit for purpose?
Integrating the model changes
Usually a tighter time frame, but controls and governance can't be sacrificed now:
- Determine the ordering of steps
- Review impacts against results from testing
- Leverage efficient, repeatable actuarial analytics
- Plan for issues and contingencies in advance
Example (title; model components affected; testing impact on key metric; integrated impact on key metric):
1. Update interest rate table: reserve inputs; none; none
2. Fix guaranteed charges on Product X: product feature inputs; +$4.1; $4.1
3. 2014 product mapped to 2011 product: revised inforce file, product feature inputs, assumptions, model coding, actuarial analytics, dashboard; +$10.0; +$8.5
Integrating the model changes: Documentation
Static (change documentation): business need addressed; model changes; inforce data; assumptions with support; product features; methodology, including simplifications; testing performed and sign-off; peer review; Business Manager approval.
Dynamic (model documentation): purpose; scope; process; limitations; simplifications; inputs (inforce data, product features, assumptions); outputs; supporting documents.