SOLVENCY II SCORING - model validation: probability distribution forecast & risk ranking (SQS)


SOLVENCY II SCORING - model validation: probability distribution forecast & risk ranking (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Number of points modelled:
- Identification and description of component distributions underlying the PDF of Basic Own Funds ("BOF") which are not based on all available information or which generate only key points
- Criteria for ensuring that the number of simulations will generate a PDF of BOF with sufficient points for precise estimation of the 99.5th percentile
- Reasons for selections
- If the PDF generates only key points, demonstrate that the methodology:
  - reflects current knowledge, or can be justified on the basis of proportionality
  - is more appropriate than alternative methods that would generate more points
Risk ranking:
- Description of the metrics used to rank risk and how they will be produced from the IM

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Number of points modelled:
- Description of any assumptions or methodologies used to enrich the PDF or underlying distributions
- Reasons for selections
- If the PDF generates only key points, establish a process to ensure that the methodology continues to:
  - meet or exceed generally accepted market practice
  - compensate with additional measures for any resulting shortcomings in the IM
Risk ranking:
- Evidence that the ability of the IM to rank risk meets the criteria in (5.221) of coverage, resolution, congruence and consistency

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
Number of points modelled:
- PDF methodology testing and validation complete and documented
- Sign off by actuarial/risk management on the methodology and the process for ensuring that it continues to meet the criteria of (5.54)
Risk ranking:
- Sign off on process for using the IM to rank risk

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
Number of points modelled:
- All Solvency II implementation met and regular reviews and maintenance started
Risk ranking:
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
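
One of the 2-4 band criteria above is showing that the number of simulations gives a PDF of BOF with enough points to estimate the 99.5th percentile precisely. A minimal Python sketch of one possible criterion, assuming a purely illustrative lognormal loss distribution and a hypothetical 2% relative-error threshold (neither is prescribed by the scoring document), bootstraps the empirical 99.5th percentile and checks the stability of the estimate as the simulation count grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def percentile_995_error(n_sims, n_boot=200):
    """Estimate the 99.5th percentile of simulated losses and the relative
    sampling error of that estimate via bootstrap resampling.
    The loss distribution is purely illustrative (lognormal)."""
    losses = rng.lognormal(mean=0.0, sigma=1.0, size=n_sims)
    q_hat = np.percentile(losses, 99.5)
    boot = [np.percentile(rng.choice(losses, size=n_sims, replace=True), 99.5)
            for _ in range(n_boot)]
    rel_err = np.std(boot) / q_hat
    return q_hat, rel_err

for n in (10_000, 50_000, 200_000):
    q, err = percentile_995_error(n)
    # Hypothetical acceptance criterion: relative standard error below 2%.
    verdict = "OK" if err < 0.02 else "more simulations needed"
    print(f"{n:>7} sims: 99.5th pct = {q:8.2f}, rel. std error = {err:5.1%}, {verdict}")
```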


SOLVENCY II SCORING - model validation: methodological adequacy (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
- Process to determine distributions and select parameters
- Selected distributions
- Reasons for selections
- For all of the above: develop criteria for actuarially and statistically adequate methods, which may refer to definitions of Applicable, Relevant, Appropriate, Transparent, Up to Date, Detailed and Parsimonious, and Robust and Sensitive
- Draft evidence for how the criteria for adequacy would be used in the processes/selections listed above

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
- Process to determine distributions and select parameters
- Selected distributions
- Reasons for selections
- Shortcomings in methodology and how dealt with
- For all of the above: full evidence for how the criteria for adequacy would be used in the processes/selections listed above

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Sign off by the relevant committees on the criteria and their application

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.


SOLVENCY II SCORING - model validation: methodological consistency & credibility (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Process for ensuring methodological consistency:
- Develop criteria for consistency between methods used to calculate the PDF and TP
Consistency with TP and BP:
- Develop process to identify and document any differences in the actuarial techniques and key assumptions used for the PDF vs. TP and BP
Justification for any inconsistencies:
- Develop process to explain, justify and document all deviations concerning methodology and assumptions
Process for reviewing methodology:
- Develop criteria for the credibility of information used as the basis of the methods; the criteria may refer to Consistency, Objectivity, Competence and Transparency

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Consistency with TP and BP:
- Develop qualitative and (where possible) quantitative techniques to assess the materiality of any deviations between methods used for the PDF and TP
Justification for any inconsistencies:
- Complete process to explain, justify and document all deviations concerning methodology and assumptions
Process for reviewing methodology:
- Process for regular methodological reviews taking into account the relevant data, information on assumptions and alternative methods
- Process to demonstrate that recent progress in the development of methods is being tracked

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on the processes developed for methodological consistency & credibility

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.


SOLVENCY II SCORING - model validation: ASSUMPTIONS (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Process for identifying and justifying assumptions:
- Draft process to identify and explain the assumptions underlying the IM with reference to their significance, limitations, model risk involved and possible alternatives
Identification of assumptions:
- Draft process to document all IM assumptions, their justification and the corresponding procedure
Justification of assumptions vs. alternatives:
- Draft process to assess the materiality of the assumptions chosen and possible alternative assumptions, including where possible both qualitative and quantitative assessment

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Process for identifying and justifying assumptions:
- Complete process of identification and explanation of assumptions in the IM
Identification of assumptions:
- Complete documentation of all IM assumptions, their justification and the corresponding procedure
Justification of assumptions vs. alternatives:
- Complete assessment of the materiality of the assumptions chosen and possible alternative assumptions, including where possible both qualitative and quantitative assessment

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on the processes developed for assumptions

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
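
The bands above ask for a quantitative (where possible) assessment of the materiality of chosen assumptions against alternatives. A minimal Python sketch, assuming a purely illustrative severity assumption (lognormal baseline vs. a hypothetical heavier-tailed Pareto alternative, neither taken from the source), re-runs the same simulation under each assumption and compares the resulting 99.5th percentiles as one way such an assessment could be expressed.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Baseline assumption: lognormal claim severity (illustrative parameters only).
baseline = rng.lognormal(mean=0.0, sigma=0.9, size=N)

# Alternative assumption: heavier-tailed Pareto severity (illustrative).
alternative = rng.pareto(a=2.5, size=N) + 1.0

q_base = np.percentile(baseline, 99.5)
q_alt = np.percentile(alternative, 99.5)

# Materiality expressed as the relative change in the 99.5th percentile.
impact = (q_alt - q_base) / q_base
print(f"99.5th pct baseline: {q_base:.2f}, alternative: {q_alt:.2f}, impact: {impact:+.1%}")
```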


SOLVENCY II SCORING - model validation: data directory & data policy (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q1 2012
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Source, characteristics, use of data:
- Draft directory of any data used, specifying its source, characteristics and usage
Data quality criteria and thresholds:
- Specify own definition for data quality based on the criteria of accuracy, completeness and appropriateness
Data quality review process:
- Draft process for regular data quality checks to ensure the criteria of (5.181) are met
Process for validation of expert judgement with data:
- Draft process for using expert judgement with data, including documentation, justification, explanation and validation
Process and standards for data updates:
- Define minimum update frequency
- Identify events which trigger more frequent updates
Data policy:
- Draft data policy covering at a minimum the items in (5.186); the policy must specify actions to be taken in the event that data does not continue to meet the criteria for data quality

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Source, characteristics, use of data:
- Complete data directory
Data quality criteria and thresholds:
- Develop qualitative and/or quantitative criteria for the different data sets
Data quality review process:
- Complete data quality review process
Process for validation of expert judgement with data:
- Complete process for using expert judgement with data
- Develop process to demonstrate that expert judgement used in addition to or as a substitute for data meets the standards of (5.185)
Data policy:
- Complete data policy

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on the processes developed for data directory and data policy
- Completed data quality assessment

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
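
The data quality review process above checks data against the criteria of accuracy, completeness and appropriateness. A minimal Python sketch of how such checks might be automated over a claims extract; the field names, file name and thresholds are hypothetical illustrations, not part of the (5.181) criteria themselves.

```python
import csv

# Hypothetical field names and thresholds mapping the qualitative criteria
# to concrete checks; an agent would substitute its own definitions.
REQUIRED_FIELDS = ["policy_id", "loss_date", "paid_amount", "currency"]
MAX_MISSING_RATE = 0.02   # completeness: at most 2% missing values per field
MIN_ROWS = 1_000          # appropriateness: enough history to be credible

def review_data_quality(path):
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    findings = []
    if len(rows) < MIN_ROWS:
        findings.append(f"only {len(rows)} rows; below minimum of {MIN_ROWS}")
    for field in REQUIRED_FIELDS:
        missing = sum(1 for r in rows if not r.get(field))
        if rows and missing / len(rows) > MAX_MISSING_RATE:
            findings.append(f"{field}: {missing / len(rows):.1%} missing")
    # Accuracy check: paid amounts must be numeric and non-negative.
    bad = sum(1 for r in rows
              if not str(r.get("paid_amount") or "").replace(".", "", 1).isdigit())
    if bad:
        findings.append(f"paid_amount: {bad} non-numeric or negative values")
    return findings or ["all checks passed"]

# Example usage against a hypothetical extract:
# print(review_data_quality("claims_extract.csv"))
```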


SOLVENCY II SCORING - model validation: DEPENDENCIES (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Process for identifying, quantifying, challenging and reviewing dependencies:
- Draft process to cover at a minimum:
  - key variables driving dependencies
  - extreme scenarios and tail dependence
Selected dependencies:
- Support for existence of diversification benefits
- Justification of the underlying assumptions
Reasons for selections:
- Draft description of additional measures in cases where only key points of the distributions are known

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Process for identifying, quantifying, challenging and reviewing dependencies:
- Draft process completed
- Tests the robustness of the system on a regular basis
Selected dependencies:
- Completed support for existence of diversification benefits
Reasons for selections:
- Completed description of additional measures for cases where only key points of the distributions are known

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on dependencies

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
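
The draft process above must cover key variables driving dependencies and extreme scenarios / tail dependence. A minimal Python sketch of why the tail matters, assuming two purely illustrative risk drivers with the same linear correlation, compares joint tail exceedances under a Gaussian dependence structure and a Student-t one (which clusters joint extremes); the correlation and degrees of freedom are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 500_000
rho = 0.5
cov = [[1.0, rho], [rho, 1.0]]

# Gaussian dependence: no tail dependence in the limit.
z = rng.multivariate_normal([0.0, 0.0], cov, size=N)

# Student-t dependence (df=4): same linear correlation, heavier joint tails.
w = rng.chisquare(df=4, size=N) / 4
t = z / np.sqrt(w)[:, None]

def joint_tail_prob(x, q=0.995):
    """P(both components exceed their own q-quantile)."""
    thresh = np.quantile(x, q, axis=0)
    return np.mean((x[:, 0] > thresh[0]) & (x[:, 1] > thresh[1]))

print(f"Gaussian joint 99.5% exceedance:  {joint_tail_prob(z):.4%}")
print(f"Student-t joint 99.5% exceedance: {joint_tail_prob(t):.4%}")
# Under independence this probability would be 0.5% * 0.5% = 0.0025%.
```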


SOLVENCY II SCORING - model validation: risk mitigation techniques (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Risk mitigation techniques included in the model:
- Draft evidence that the use of risk mitigation actually causes a reduction in net risk
Validation against criteria for inclusion:
- Draft evidence that risk mitigation techniques covered in the IM meet the following criteria:
  - risk transfer takes place from an economic perspective
  - legal certainty, effectiveness and enforceability, with documentation
  - liquidity and ascertainability of value
  - identification and assessment of secondary risks

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Risk mitigation techniques included in the model:
- Completed evidence that the use of risk mitigation actually causes a reduction in net risk
- Consideration of the impact of restrictions or limitations that exist for intra-group risk transfer
Validation against criteria for inclusion:
- Completed evidence that risk mitigation techniques covered in the IM meet the criteria

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on risk mitigation techniques

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
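
The first key area above asks for evidence that the modelled risk mitigation actually reduces net risk. A minimal Python sketch, assuming a purely illustrative gross loss distribution and a hypothetical excess-of-loss reinsurance layer (30 xs 20, not taken from the source), compares gross and net 99.5th percentiles from the same simulations.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Illustrative gross annual losses (lognormal; not calibrated to any book).
gross = rng.lognormal(mean=2.0, sigma=1.0, size=N)

# Hypothetical excess-of-loss treaty: limit 30 in excess of attachment 20.
attachment, limit = 20.0, 30.0
recovery = np.clip(gross - attachment, 0.0, limit)
net = gross - recovery

for label, x in (("gross", gross), ("net", net)):
    print(f"{label:>5}: mean = {x.mean():6.2f}, 99.5th pct = {np.percentile(x, 99.5):7.2f}")
# Evidence of mitigation: the net 99.5th percentile sits below the gross one.
```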


SOLVENCY II SCORING - model validation: financial guarantees and options and future management actions (SQS)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Identification of guarantees and options:
- Identify all relevant financial guarantees and contractual options
- Draft specification of how they will be modelled
Modelling methodology for each:
- Evidence that the modelling methodology will be consistent with TPs
Identification of expected non-contractual payments:
- Identify all expected payments, whether or not contractually guaranteed
Modelling methodology for each:
- Draft evidence that the IM will account for payments not contractually guaranteed in a manner consistent with TP
Identification of future management actions:
Governance arrangements for each:
- Establish governance framework for future management actions
Modelling methodology for each:
- Identify IM assumptions for future management actions and demonstrate that conditions beyond the agent's control are accounted for

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Identification of guarantees and options:
- Complete specification of how guarantees and options will be modelled
Modelling methodology for each:
- Complete evidence that the IM will account for payments not contractually guaranteed in a manner consistent with TP
Identification of future management actions:
Governance arrangements for each:
- Establish a governance framework around management actions
- Develop process to report significant deviations from planned management actions
Modelling methodology for each:
- Evidence that future management actions are: (a) accounted for in a manner consistent with that used for TP; (b) based on assumptions that are objective, realistic and verifiable
- Assessment of the materiality of future management actions

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on methodologies and governance on financial guarantees, contractual options and non-contractual payments

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.

SOLVENCY II SCORING - model validation: Calibration (CVP)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Process for ensuring appropriate calibration:
- If the IM is not calibrated to the standard VaR 99.5% / 1-year calibration of the SCR, then agents shall provide qualitative support for their alternative risk measure / time period
- Justify the time horizon in the context of the average duration of liabilities and the business model
- Ensure that the data used for shorter time periods (less than 1 year) is appropriate
- If the IM does not explicitly produce the SCR at the standard calibration, then the agent must develop a draft process to demonstrate equivalent policyholder protection according to the criteria of (6.55)
- Determine a schedule to demonstrate policyholder equivalence on at least an annual basis

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Process for ensuring appropriate calibration:
- Demonstrate that all significant risks over a one-year period are properly managed
- Complete methodology for demonstrating an equivalent level of protection; the methodology must include a sufficient level of validation

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off for the methodology for showing equivalent policyholder protection

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
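
The standard calibration referenced above is the 99.5% Value-at-Risk of the change in Basic Own Funds over a one-year horizon. A minimal Python sketch, assuming purely illustrative simulated one-year BOF outcomes (the opening value, drift and volatility are arbitrary), shows how the SCR would be read off that distribution.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000

# Illustrative opening Basic Own Funds and simulated one-year outcomes.
bof_0 = 1_000.0
bof_1 = bof_0 + rng.normal(loc=50.0, scale=120.0, size=N)  # hypothetical P&L

# Change in Basic Own Funds over one year; losses are negative changes.
delta_bof = bof_1 - bof_0

# SCR at the standard calibration: the 99.5% VaR of the loss in BOF,
# i.e. the negated 0.5th percentile of the change distribution.
scr = -np.percentile(delta_bof, 0.5)
print(f"SCR (VaR 99.5%, 1 year) = {scr:.1f}")
```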


SOLVENCY II SCORING - model validation: Validation (CVP)

Timing: Q3 2010 | Q1 2011 | Q4 2011 | By Q1 2012
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Purpose and scope of validation:
- Define a policy scope to include as a minimum the points in (8.127)
- Identification of any parts of the IM framework not covered by the policy and why
- Draft process for both quantitative and qualitative assessment of the validation test results and how they will be used to gain comfort that the IM is appropriate
Validation tools used:
- Summarise the validation tools to be used, which must include at a minimum those in (8.54)
Frequency of validation process:
- Draft schedule for the validation process
Governance of validation results:
- Draft policy for governance of the validation results covering:
  - responsibilities for validation tasks
  - reporting of results of validation tests
  - criteria and path for escalation of results
  - senior management involvement in the validation process
Independent review:
- Set out how the independent review is used within the validation process
Documentation:
- Draft documentation of the policy addressing how the policy will be carried out and the responsibilities

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Purpose and scope of validation:
- Explicit consideration of any expert judgement
- Statement of the goals and measures of backtesting
Validation tools used:
- Provide reasons for why the selected validation tools are appropriate
Frequency of validation process:
- Criteria requiring additional validation checks beyond those regularly scheduled
Limitations and future developments:
- Description of the limitations of the current policy
- Planned developments to meet identified limitations
Independent review:
- Detail how the review is independent and how independence will be maintained
Documentation:
- Completed validation policy document

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on the validation policy
- Signed report of validation tests by actuarial/risk management
- Draft (end of August) and final (end of October) validation reports submitted to Lloyd's

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- Validation report updated as appropriate to reflect progress against any gaps and Lloyd's review feedback
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.


SOLVENCY II SCORING - model validation: Profit & Loss Attribution and backtesting (CVP)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Profit and loss attribution methodology:
- Draft P&L attribution methodology capable of explaining a large part of annual P&L
- Definition of P&L consistent with the PDF
Application of P&L attribution:
- Draft process for application of results of P&L attribution to:
  - validation
  - management of the business / use test
Governance process over P&L attribution:
- Demonstrate that the classification of risks for P&L attribution reflects the risk profile of the agent
Backtesting process:
- Draft specification of a backtesting process that covers:
  - the steps in (8.150)
  - analysis of backtesting results not above trigger event
  - definition of significant deviations between model results and reality
  - identification of the reasons for divergence
Trigger events:
- Draft definition of trigger events

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Profit and loss attribution methodology:
- Detail and explain the differences in the profits and losses between the P&L attribution and those reported in accounting systems
- Complete P&L attribution methodology
Governance process over P&L attribution output:
- Process for escalating to the management body if the results from the P&L attribution do not reflect the risk profile of the agent
- Procedure for improving the model if P&L attribution indicates that the model does not reflect the risk profile adequately
Backtesting process:
- Completed specification of the backtesting process including:
  - a defined escalation path for significant deviations
  - evidence that it will be applied at various levels of the business
  - a process for common-sense comparison between prediction and realisation where expert judgement has been used
  - schedule for backtesting
Trigger events:
- Completed definition of trigger events

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on P&L attribution and backtesting

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
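
The backtesting specification above calls for a definition of significant deviations between model results and reality and an escalation path for trigger events. A minimal Python sketch, assuming an illustrative modelled distribution for the year's result and a hypothetical trigger defined at the 1st/99th modelled percentiles (the distribution and trigger definition are not from the source), checks where a realised outcome falls.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative modelled distribution of the year's result (e.g. class-level P&L).
modelled = rng.normal(loc=10.0, scale=25.0, size=50_000)

# Hypothetical trigger definition: a realised result outside the modelled
# 1st-99th percentile range counts as a significant deviation to escalate.
lo, hi = np.percentile(modelled, [1, 99])

def backtest(realised):
    """Return the modelled percentile of the realised result and a trigger flag."""
    pctile = np.mean(modelled <= realised) * 100
    flag = realised < lo or realised > hi
    return pctile, flag

for realised in (5.0, -60.0):
    pctile, flag = backtest(realised)
    status = "TRIGGER: escalate per governance process" if flag else "within tolerance"
    print(f"realised {realised:+6.1f} sits at the {pctile:4.1f}th modelled percentile: {status}")
```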


SOLVENCY II SCORING - model validation: model robustness and stress & scenario testing (CVP)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least some of the key areas addressed and evidenced.
Process to establish robustness:
- Identify and document the key assumptions in the IM
- Description of any other sensitivity tests on the model
Governance process over robustness testing output:
- Process in place to escalate results of sensitivity tests to senior management
- Process for reviewing and applying results of sensitivity tests, in particular to parts of the IM relying on expert judgement
Stress and scenario testing process:
- Draft description of the stress and scenario methodology
Governance process over stress and scenario testing output:
- Draft process for monitoring, assessing and updating S&S testing
- Detail the responsibilities of the senior management involved in overseeing the S&S testing programme
- Process for comparing S&S results to risk tolerance limits

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements. All of the key areas addressed and most evidenced.
Process to establish robustness:
- Process to assess and evaluate significant changes in output resulting from small changes in parameters
- Completed sensitivity testing process
Governance process over robustness testing output:
- Process for regular review of sensitivity testing of results
- Completed governance process for sensitivity testing
Stress and scenario testing process:
- Explain why the selected S&S tests are adequate
- Completion of any S&S tests specified by Lloyd's
- Process for reverse stress testing
Governance process over stress and scenario testing output:
- Detail process for senior management involvement in the S&S programme

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- Completed testing and sign off by actuarial/risk management on model robustness and stress & scenario testing
- Reverse stress tests signed off by the board

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started

Difference between scores in each band should reflect the number of key areas addressed and quality.
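
The 5-7 band above asks for a process to assess significant changes in output resulting from small changes in parameters. A minimal Python sketch of such a sensitivity test, assuming an illustrative two-parameter loss model (the parameters, bump size and model are hypothetical), perturbs each parameter and records the relative change in the 99.5th percentile.

```python
import numpy as np

def pct995(mu, sigma, n=200_000, seed=6):
    """99.5th percentile of an illustrative lognormal loss model.
    A fixed seed gives common random numbers across parameter bumps."""
    rng = np.random.default_rng(seed)
    return np.percentile(rng.lognormal(mu, sigma, size=n), 99.5)

base_params = {"mu": 1.0, "sigma": 0.8}
base = pct995(**base_params)
print(f"base 99.5th pct: {base:.2f}")

# Perturb each key assumption by +5% and report the relative output change.
for name in base_params:
    bumped = dict(base_params)
    bumped[name] *= 1.05
    shifted = pct995(**bumped)
    print(f"+5% {name:>5}: 99.5th pct {shifted:8.2f} ({(shifted - base) / base:+.1%})")
```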


SOLVENCY II SCORING - model validation: external models & data (EMD)

Timing: Q3 2010 | Q1 2011 | Q3 2011 | By Q4 2011
Score: 1 | 2-4 (design/evidence) | 5-7 (evidence/build) | 8-9 (testing/sign off) | 10 (fully in use/BAU)

Score 1
Progress: Agent demonstrates little understanding of the requirements. Little or no progress made in design. Evidence available is insufficient to address any of the key areas.

Score 2-4 (design/evidence)
Progress: Agent demonstrates a reasonable understanding of the requirements. At least the key areas and some of the additional areas addressed and evidenced.
Two key areas identified:
1. Identification of all material external models as appropriate to the syndicate, including cat models, ESGs, and reserving models and data provided by third parties (e.g. broker analyses)
2. Identification of all material external data sets as appropriate to the syndicate, including RI credit factors and sources
Four additional areas identified:
1. Outline the role of the external models and data within the scope of the internal model, including evidence of the potential materiality of the EMD
2. Draft documentation around the justification of using EMD vs. internal models and data
3. Draft methodology to demonstrate understanding and limitations of EMD
4. Draft outline of the review and validation process

Score 5-7 (evidence/build)
Progress: Agent demonstrates clear and detailed understanding of the requirements.
- Comprehensive and structured identification of all external models and data sets is evidenced
- Draft documentation to explain how the use of EMD complies with Articles 120-126
- Demonstrate how alternative models and data have been, and will be, considered
- Review EMD to understand whether it introduces any material risk that should be considered within the SCR
- Process fully drafted which creates a link between the use of EMD and:
  - the design and operational details of the model
  - the model change process
  - internal model governance

Score 8-9 (testing/sign off)
Progress: Agents are close to finalising, subject to testing / sign-off approvals.
- EMD processes tested and signed off
- For the review and validation process, the use of expert judgement in relation to EMD must be fully documented and the rationale explained

Score 10 (fully in use/BAU)
Progress: Agents have completed the design, build and test of the element above. Nothing further required to be done except follow the process established for regular reviews (unless change).
- All Solvency II implementation met and regular reviews and maintenance started
- Model change system fully operational and in use as part of BAU
- Assurance process has shown the system to work fully

Difference between scores in each band should reflect the number of key areas addressed and quality.
