THE ROLE OF MODEL VERIFICATION IN MODEL RISK MANAGEMENT


A QuantUniversity Whitepaper
By Sri Krishnamurthy, CFA
www.quantuniversity.com | sri@quantuniversity.com

CONTENTS

1. Introduction
2. Model Verification and Model Validation
3. A Need for Formal Model Verification in Model Risk Management
   3.1. Determining the degree of accuracy of implementation
   3.2. Adherence to regulatory requirements and guidance documents
   3.3. Adherence to internal model risk and deployment policies
   3.4. Understanding the art and science of model building
4. Elements of Model Verification
   4.1. Scoping the Model Verification process
      4.1.1. Model Scope
      4.1.2. Model Specification -> Model Design -> Model Implementation
      4.1.3. Defining acceptance criteria
   4.2. Model Implementation Checks
      4.2.1. The levers for the model: input/output analysis
      4.2.2. Failure modes
      4.2.3. Determining the degree of correctness
   4.3. Model Policy and Process Checks
   4.4. Model Verification Reporting
5. Summary
6. References

1. INTRODUCTION

The financial crisis of 2008 and the regulatory initiatives since then have put the spotlight on financial models and on the need to manage model risk in financial institutions. More and more financial institutions are formally incorporating model risk management programs and changing their model development, testing and deployment processes to adhere to regulatory guidelines and best practices. Since the field is relatively new and best practices are still evolving, organizations have had to experiment with various initiatives to customize model risk management programs that balance regulatory requirements and organizational objectives.

As model complexity increases, model risk management becomes difficult, since only a few groups in the organization know what a model actually does. Translating model designs into model implementations brings additional challenges. First, model development teams must ensure that they understand the model designs and can implement them accurately. For complex and proprietary financial products, quants find it challenging to communicate the design, intent and appropriate use of quant models. Second, ensuring that the model in question is implemented as per the intended design, and validating that its outputs are representative of the characteristics they are modeled after, is complicated. For computer simulation models, these activities fall under Model Verification and Model Validation [2]. Financial organizations are realizing the need to integrate rigorous model verification and model validation processes prior to deployment in order to understand and mitigate model risk. Though a great deal of literature and practical guidance is available for model validation, very little is available on model verification for financial models. In our previous article [1], we discussed best practices for the implementation of an effective model risk management program.
In this article, we delve deeper into the role of model verification in model risk management. We start by discussing the differences between model verification and model validation. We then discuss why formal model verification is necessary and make the case for it as a part of model risk management. Finally, we discuss four key elements that need to be considered as a part of a formal model verification process. We hope that this article highlights the importance of model verification in building a robust model risk management practice.

2. MODEL VERIFICATION AND MODEL VALIDATION

Though the concepts of model verification and validation are relatively new in finance, they form a well-established discipline that is widely adopted in fields such as design engineering and manufacturing. Whether it is designing structures that support skyscrapers or running crash tests for cars, computer models are built to simulate actual conditions and to test how the modeled system would behave when exposed to various input parameters and stresses. During this process, verification and validation are performed to determine compliance with the design and to determine whether the model behaves as the real-world process would. Some of the best definitions of verification and validation are found in a Department of Defense instruction [3]. Verification is defined as:

"The process of determining that a model or simulation implementation and its associated data accurately represent the developer's conceptual description and specifications."

Validation is defined as:

"The process of determining the degree to which a model or simulation and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model."

Note that the primary goal of verification is to ensure correctness and to determine whether the implementation accurately reflects the design specification. The goal of validation is to determine how closely the model reflects the behavior of the process or product it represents. In financial model risk management, verification and validation are often clubbed together under the umbrella of model validation. In the model risk literature, Morini [4] was one of the first to draw attention to the role of model verification in model risk management. In this excellent reference on model risk, Morini discusses a scheme for model validation and notes the importance of model verification primarily as a check for internal consistency.
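To make the distinction concrete, the following Python sketch contrasts the two activities for a hypothetical Black-Scholes pricer. The parameters and the "observed market quote" are purely illustrative assumptions: verification checks the implementation against a property the specification implies (put-call parity), while validation compares the model's output against the real world.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes(S, K, r, sigma, T, kind="call"):
    """Black-Scholes price of a European option (no dividends)."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    if kind == "call":
        return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)
    return K * math.exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

# --- Verification: does the implementation match the specification? ---
# Put-call parity (C - P = S - K*exp(-rT)) is implied by the model's
# mathematics, so a correct implementation must satisfy it to machine precision.
S, K, r, sigma, T = 100.0, 100.0, 0.02, 0.25, 1.0
call = black_scholes(S, K, r, sigma, T, "call")
put = black_scholes(S, K, r, sigma, T, "put")
assert abs((call - put) - (S - K * math.exp(-r * T))) < 1e-10

# --- Validation: does the model reflect the real world? ---
# Compare the model price against an observed market quote (an illustrative
# number here) and flag the model if it deviates beyond a set tolerance.
observed_market_price = 10.60
assert abs(call - observed_market_price) / observed_market_price < 0.05
```

Note that the verification check would still pass even if the model were a poor fit for the market, and the validation check would still pass even if the implementation contained offsetting bugs; this is why the two activities are complementary.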
Though model verification and validation are both used in managing model risk, it is essential to treat the two processes separately so that a systematic approach can be taken to each. Let's discuss the need for formal model verification next.

3. A NEED FOR FORMAL MODEL VERIFICATION IN MODEL RISK MANAGEMENT

One might argue that model verification is easier in engineering and the physical sciences, where the laws of physics govern processes and enable the design of replicable experiments based on test data. In finance, history rarely repeats itself, and historical data provides limited value for verification. Even worse, for new innovations and custom financial engineering products (such as structured products), there isn't enough historical data to do proper model verification. If this is the case, what role does model verification play in managing model risk in finance? There are multiple reasons to advocate a formal model verification step in the model review and validation process. We list the main ones here.

3.1. Determining the degree of accuracy of implementation

A model is typically made up of multiple components and dependencies. In fact, the Supervisory Guidance on Model Risk Management [5] notes:

"A model consists of three components: an information input component, which delivers assumptions and data to the model; a processing component, which transforms inputs into estimates; and a reporting component, which translates the estimates into useful business information."

To ensure that the model works as designed, each component should do its part and, overall, the model should perform as per the design. A formal model verification process facilitates verifying that each component, and the model as a whole, performs the role it was designed to perform. This includes ensuring that the data and inputs are correct, that the processing component computes the requisite metrics as designed, and that the reporting or output component displays or stores results accurately. In addition, many models are not fully developed in-house and require data and code sourced from third-party vendors and consultants.
Even if the model wasn't fully developed within the organization, the onus of ensuring that the model works as per the design is still on the organization. A formal model verification step acts as a gating process to decide whether the model can be deployed. Even if the model's code wasn't sourced from outside, many organizations reuse code built for other models, requiring testing to verify that the reused code provides the desired functionality. Also, models are typically developed by multiple people and sometimes by different groups. A formal verification step can be used to verify that the integrated model works as desired. A key challenge organizations face when building models is dealing with legacy code. When I worked for a large financial company, I experienced these challenges firsthand: one project required using four libraries developed in three different languages, over ten years, by different groups and by an acquired company. Companies that have gone through mergers and acquisitions inherit legacy code when companies and departments merge. Sometimes the original developers are no longer available, but the models and functions continue to be used. When legacy code is integrated with new code to build models, it is important to ensure that the model functions as per the design. A formal model verification step helps ensure this.
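As an illustration of component-wise verification, the toy Python sketch below splits a hypothetical volatility model into the three components named in the guidance above and checks each in isolation before checking the model end to end. All names and the model itself are illustrative assumptions, not a prescribed structure.

```python
import statistics

def input_component(raw_returns):
    """Deliver data to the model; reject obviously invalid inputs."""
    if not raw_returns:
        raise ValueError("empty return series")
    if any(not isinstance(r, (int, float)) for r in raw_returns):
        raise ValueError("non-numeric return in series")
    return list(raw_returns)

def processing_component(returns):
    """Transform inputs into an estimate (here, sample volatility)."""
    return statistics.stdev(returns)

def reporting_component(vol):
    """Translate the estimate into useful business information."""
    return f"Estimated volatility: {vol:.2%}"

# Verify each component in isolation, then the model as a whole.
returns = [0.01, -0.02, 0.005, 0.015, -0.01]
assert input_component(returns) == returns          # input passes clean data
try:
    input_component([])                             # input rejects bad data
except ValueError:
    pass
else:
    raise AssertionError("empty input should be rejected")

vol = processing_component(input_component(returns))
assert vol > 0                                      # estimate is well-formed
report = reporting_component(vol)
assert report.startswith("Estimated volatility:")   # reporting is intact
```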

3.2. Adherence to regulatory requirements and guidance documents

There have been many regulatory efforts to advocate best practices for model risk management. The Dodd-Frank Act, Basel and the Solvency frameworks each address model risk and provide guidance on how institutions can incorporate model risk management programs. The Supervisory Guidance on Model Risk Management [5] is one of the key regulatory documents providing extensive guidance on best practices and on the expectations of regulators when addressing model risk. Though model verification isn't explicitly addressed, many of the key aspects that model verification covers are mentioned in this document. A key statement in the Model Development and Implementation section of the OCC document [5] notes:

"Developers should ensure that the components work as intended, are appropriate for the intended business purpose, and are conceptually sound and mathematically and statistically correct."

This captures the essence of a formal model verification process.

3.3. Adherence to internal model risk and deployment policies

In addition to regulatory requirements, companies have internal model risk policies and guidelines for internal groups. Many quantitative models developed in financial companies are proprietary, and it is essential to vet them to ensure that companies aren't taking excessive risk by trusting models that aren't adequately tested and verified. Beyond the technical requirements of a model, quant groups are expected to implement business rules and organizational mandates in the model to ensure that its use adheres to internal policies. Formal model verification can be used to check adherence to the various model risk policies adopted by the organization. In addition, when models are deployed into production, they are modified to comply with information technology (IT) deployment policies.
These may include authentication and authorization rules, logging, transaction management, exception handling, etc. In addition to software testing, formal model verification can check adherence to these IT deployment policies. Finally, it is widely acknowledged that quantitative models are complex and that only a few people, typically the model development group, understand their intricacies. The model validation group can adopt model verification into the model validation process to verify the functionality of the models. Since senior and supervisory management typically don't review the details of specific models, incorporating formal model verification, and setting expectations on clearance criteria at a policy level, helps senior management ensure that model risk policies are implemented.

3.4. Understanding the art and science of model building

Model development is both an art and a science, and so is model verification. The same model can be implemented in multiple ways, using multiple algorithms and different programming languages. Model verification requires understanding the design intent and the implementation of the model, and thoroughly testing whether the model performs well under both normal and stressed conditions. In addition, model verifiers must creatively devise tests to check whether the model has weaknesses in handling edge cases or abnormal inputs. Failure and error handling analysis, i.e. whether a model fails gracefully with informative error messages or aborts abruptly, is another area that requires model verifiers to creatively devise tests that cause models to fail, and then to analyze the failure modes. Leveraging the best practices of model development and programming-language-specific guidelines, a formal model verification step can uncover many issues that may not have been observed or encountered by the model developers.
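The kind of edge-case and failure-mode tests described above can be sketched as follows. The model entry point, its parameters and the error messages are hypothetical; the point is that each invalid input must raise an intelligible error rather than silently compute.

```python
def correlated_scenario(corr, horizon_days):
    """Hypothetical model entry point with input validation at the gate."""
    if not -1.0 <= corr <= 1.0:
        raise ValueError(f"invalid correlation {corr}: must be in [-1, 1]")
    if horizon_days <= 0:
        raise ValueError(f"invalid horizon {horizon_days}: must be positive")
    return corr * horizon_days ** 0.5   # placeholder computation

# Failure-mode tests: each invalid input must raise, not compute.
for bad_corr in (1.5, -2.0):
    try:
        correlated_scenario(bad_corr, 10)
        raise AssertionError("model accepted an invalid correlation")
    except ValueError as e:
        assert "invalid correlation" in str(e)   # message is intelligible

try:
    correlated_scenario(0.5, 0)
    raise AssertionError("model accepted an invalid horizon")
except ValueError as e:
    assert "invalid horizon" in str(e)
```

Tests of this shape are cheap to write, and a verifier can sweep them across every documented parameter of the model rather than only the "happy path" the developers exercised.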

4. ELEMENTS OF MODEL VERIFICATION

Now that we understand the need for model verification in model risk management, what constitutes model verification? Having worked with many clients verifying their models, we have developed best practices that we advocate to clients to help create and adopt a formal model verification program. We have devised a four-step process that we believe every model verification program should include: (1) scoping the model verification process, (2) model implementation checks, (3) model policy and process checks, and (4) model verification reporting. The four steps are summarized in Figure 1 and elaborated in this section.

[Figure 1: The Model Verification process - Scoping the Model Verification process -> Model Implementation Checks -> Model Policy and Process Checks -> Model Verification Reporting]

4.1. Scoping the Model Verification process

Prior to model verification, it is essential that the scope of the model verification process be defined and that all stakeholders (supervisors, end users, IT, model developers, model validators) are aware of the process and of the criteria used in the verification of the models. Let's discuss this in detail.

4.1.1. MODEL SCOPE

We believe that the scope of model verification must include all the elements of the model that fall under the purview of model risk management. The verification should cover the model itself (source code and executables), input and output files, dependent libraries, and execution environments that closely mirror how the model would be used. We also advocate reviewing the associated documentation (including design documentation, methodology documentation and end-user guides), test plans and test results. How can all the components required for model review be collected systematically? In manufacturing, a product assembly is typically associated with a bill of materials (BOM). The BOM provides a comprehensive list of all the components that go into the final product and helps ensure that the finished product complies with the design.
We advocate maintaining a Model BOM that captures all the software components and associated dependencies required for a model. The scope of model verification should include verification of all the components listed and maintained in the Model BOM, along with the associated documents and artifacts (such as drawings and workflows) that detail how the various components interact in the design of the model.
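A Model BOM can be as simple as a structured record that is checked for completeness before verification begins. The sketch below uses Python; the field names, model name and entries are illustrative assumptions, not a standard schema.

```python
# Illustrative Model Bill-of-Materials: every artifact in scope for
# verification, recorded in one place. All names are hypothetical.
model_bom = {
    "model_name": "CreditSpreadCurve",
    "version": "2.3.1",
    "source_code": ["curve_builder.py", "bootstrapper.py"],
    "dependent_libraries": [
        {"name": "numpy", "version": "1.26.4"},
        {"name": "internal-daycount-lib", "version": "0.9"},  # legacy library
    ],
    "input_files": ["market_quotes.csv"],
    "output_files": ["zero_curve.csv"],
    "execution_environment": "python-3.11, linux-x86_64",
    "documentation": ["design_spec_v2.pdf", "user_guide.pdf"],
    "test_artifacts": ["test_plan_v2.pdf", "regression_results.csv"],
}

# Scoping check: every category the BOM tracks must be populated before
# the model enters the verification process.
missing = [field for field, value in model_bom.items() if not value]
assert not missing, f"BOM incomplete: {missing}"
```

In practice the same record could live in a model inventory database or a version-controlled YAML file; what matters is that the verification scope is enumerated rather than assumed.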

4.1.2. MODEL SPECIFICATION -> MODEL DESIGN -> MODEL IMPLEMENTATION

Typically, quant researchers prepare a formal model specification that describes the model in detail: the modeling assumptions, the mathematics and the modeling approach. The specification forms the basis for a design specification that details how the model will be implemented. Model prototyping is typically done to demonstrate the feasibility and characteristics of the final model. The implementation team then implements the production model on a chosen software platform. Since the final model is the realization of the model specification and design, model verification should include verification of the model against the specification and associated design documents. Note that the goal here is not to question the model's specification and approach. The goal is to verify that the model is implemented as per the specification and design. Formal model validation is where the validity of the model for its intended purpose is addressed.

4.1.3. DEFINING ACCEPTANCE CRITERIA

To determine the correctness of the implementation of a model, it is essential to first determine acceptance criteria. This helps communicate the intent and get agreement among stakeholders on the verification process. It also helps set the bar that needs to be cleared for the model to pass verification. The acceptable accuracy of results and the tests to be conducted need to be defined prior to verification. A review of model assumptions and approximations needs to be done to determine what tests are feasible and appropriate for the model. Performance criteria and service level agreements (SLAs) need to be defined to ensure that the model meets the design criteria and end-user requirements. It is recommended that the acceptance criteria be reviewed by the stakeholders prior to model verification to set appropriate expectations on the process.

4.2. Model Implementation Checks

Model implementation verification is not trivial and is both an art and a science. It requires expertise in programming and design, and it also requires understanding the mathematical model and the modeling approach that was implemented. Since this is a complex task, a formal model implementation verification plan needs to be developed after reviewing the model specification and design and understanding the model acceptance criteria. Domain experts and implementation specialists should be consulted and the plan reviewed to ensure its completeness. At a minimum, the plan should include the following three elements as part of the model implementation checks.

4.2.1. THE LEVERS FOR THE MODEL: INPUT/OUTPUT ANALYSIS

Most models are designed to process inputs and generate certain outputs. The input parameter space must be reviewed and tests designed to evaluate how the model performs for different values of inputs and outputs. These tests should include normal and permissible input ranges as well as invalid and stressed inputs. A review of whether the model performs as expected must be conducted, and the impact assessed and documented. Special attention must be paid to input validation, failure modes and controls that prevent unknown or unintended behaviors (see section 4.2.2). I have found that even basic what-if analysis (for example, specifying invalid correlations or invalid date/calendar specifications) can reveal problems and inconsistencies in models, and when feasible, I prefer to do thorough parameter sweeps to test how the model performs across a wide range of values. A review of the test plans and test results produced by the model development team helps determine the rigor of testing and whether adequate testing was done during development. Any issues found should be captured and reviewed to determine how critical they are and whether there are sufficient controls in place to address the risk they pose. Refer to our paper on quantifying model risk [6] for additional details on this topic.

4.2.2. FAILURE MODES

Error and exception handling is key to ensuring that a model is designed robustly and can be trusted to operate not only under normal conditions but also when it encounters unanticipated conditions. A well-designed model must anticipate potential issues and implement controls to address unanticipated failures appropriately. Model verification tests must check how models behave when invalid or extreme inputs are presented. Tests must be conducted to evaluate whether the model then communicates intelligible error messages back to the users, whether it logs messages appropriately for replication, and whether it exits gracefully or crashes without messages. Failure testing is as much an art as a science and isn't trivial. Models often have many parameters, and testing all possible combinations of parameters is practically impossible. Model verifiers must creatively design tests to create failure conditions. Input validation is a key part of the model: it ensures that invalid inputs are caught early and that the model does not continue if invalid inputs are presented. This is a common practice in manufacturing and engineering. I recollect my experience as a mechanical engineer stress-testing hydraulic pumps. The input hoses connected to the pump were designed to fail early and protect the hydraulic pump. The goal of stress testing was to ensure that when an abnormally high pressure (an invalid input) was applied, the hoses failed before the high-pressure fluid reached the hydraulic pump. Hoses cost 1/100th as much as the hydraulic pump and are easily replaceable in the field.
We can draw parallels to input validation, where checking for invalid inputs before processing protects the model from potential failures. For models expected to perform transactions, output analysis is crucial. Models must be checked to ensure that the state of variables is managed appropriately, so that in the event of model failures, transactions are rolled back and invalid states are cleared. Tests must also be done to verify that appropriate logs and variable dump files are created to facilitate debugging and replication of issues. Unanticipated model failures may occur for multiple reasons, and it is impossible to forecast all potential failures. But with adequate planning, systematic testing for errors and appropriate controls, model failures can be minimized, reducing the risk of catastrophic failures.

4.2.3. DETERMINING THE DEGREE OF CORRECTNESS

Checking whether a model generates acceptable results is remarkably complex, especially for complex financial products and for products that have limited historical data. The model verification and acceptance criteria define the degree of correctness expected in the model's results. One way to verify that model-generated results meet the acceptance criteria is to benchmark results from the model against an independently created model in an alternate language or platform. This could be a prototype the quants developed or an alternate, simplified model that generates comparable results. If a model is meant to replace an existing model, backward compatibility tests can be conducted with the existing model used as the reference. In addition, interactive debugging and code and variable introspection can be used to verify calculations; most programming environments provide variable and state inspection tools to facilitate incremental debugging. Unit tests and functional tests enable verifying atomic functionality in the models.
We also advocate reviewing the test plans and test results that the model developers have run, both to verify the testing results and to understand the rigor of testing adopted during and after development.
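Benchmarking against an independent reference implementation, combined with a parameter sweep, can be sketched as below. The annuity pricer stands in for a more complex production model, and the tolerance is an assumed acceptance criterion; in practice both would come from the model specification and the agreed acceptance criteria.

```python
def pv_annuity_fast(payment, rate, n):
    """Production-style closed-form present value of an n-period annuity."""
    if rate == 0:
        return payment * n
    return payment * (1 - (1 + rate) ** -n) / rate

def pv_annuity_reference(payment, rate, n):
    """Independent reference model: discount each cash flow explicitly."""
    return sum(payment / (1 + rate) ** t for t in range(1, n + 1))

# Parameter sweep over rates and maturities; flag any divergence beyond
# the acceptance tolerance agreed with stakeholders (assumed 1e-9 relative).
for rate in [0.0, 0.001, 0.02, 0.05, 0.10]:
    for n in [1, 12, 120, 360]:
        fast = pv_annuity_fast(100.0, rate, n)
        ref = pv_annuity_reference(100.0, rate, n)
        assert abs(fast - ref) <= 1e-9 * max(1.0, abs(ref)), (rate, n)
```

The reference model is deliberately slow and simple; its only job is to be obviously correct, so that any disagreement points at the production implementation.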

4.3. Model Policy and Process Checks

Models are not only technical implementations of the model specifications; they also encapsulate organizational business rules that follow the policies and controls adopted by the organization. The model risk policy of an organization typically defines the regulatory and internal model policies that need to be followed when implementing models. These could include security and authentication rules, data governance policies, and rules governing how the model sources inputs, processes data and saves results. The model verification step should include a model policy check to ensure that the model implementation adheres to the published model risk policy. In addition, many organizations have IT and deployment guidelines to ensure that models are ready for production. The model verification step can be used to verify compliance with IT deployment policies and whether the models are indeed ready for prime time. Finally, as we noted in [1], since model risk drivers permeate the entire model life cycle, effective model risk management requires that model risk be addressed throughout the lifecycle of the model. The model verification step can be used to determine whether the model implementation, associated documentation, test plans, results, etc. comply with the documented policies for mitigating model risk throughout the model life cycle. For example, if the organization has a policy of reviewing and addressing model issues through a formal process, then verifying the lists of outstanding issues and reviewing how well they are tracked, categorized, fixed and tested sheds light on the rigor with which model risk is addressed and provides evidence of the policy's implementation. This can be used to identify gaps in testing and verification and to introduce improvements into the model lifecycle management processes.

4.4. Model Verification Reporting

Once the model verification is completed, the results must be appropriately documented and shared with the respective stakeholders for review. We believe that a comprehensive report on model verification helps assess the maturity of the model and whether it is truly ready for deployment. Typically, the model risk committee reviews the results against the acceptance criteria and determines a plan of action to address issues that require attention prior to deployment. After agreement from the model development team, the plan of action should be documented and communicated to the stakeholders, including senior and supervisory management.

5. SUMMARY

In this article, we discussed the role of model verification in model risk management. We differentiated model verification from model validation and discussed the need for a formal model verification process. We then discussed four elements that we believe are essential components of a formal model verification process. Many of the aspects of model verification discussed here are meant to provide practical tips for operationalizing the best practices of model risk management and for addressing many model risk concerns regarding model verification. We encourage readers to consider customizing and adopting a formal model verification process as a part of their model review and validation processes. We believe that formal model verification enables a systematic approach to addressing model risk and assists in building a robust model risk management practice.

6. REFERENCES

1. Krishnamurthy, S. (2014), The Decalogue. Wilmott, 2014: 58-61
2. http://en.wikipedia.org/wiki/verification_and_validation_of_computer_simulation_models
3. DoD Modeling and Simulation (M&S) Verification, Validation, and Accreditation (VV&A), DoD Instruction 5000.61, December 9, 2009
4. Morini, M. (2011), Understanding Model Risk, in Understanding and Managing Model Risk: A Practical Guide for Quants, Traders and Validators, John Wiley & Sons, Ltd
5. OCC 2011-12 / SR 11-7: Supervisory Guidance on Model Risk Management
6. Krishnamurthy, S. (2014), Quantifying Model Risk. Wilmott, 2014: 56-59

BIO: Sri Krishnamurthy, CFA, CAP is the founder of www.quantuniversity.com, a data and quantitative analysis company. QuantUniversity recently launched a new offering called Model Risk Analytics and offers advisory services for model risk management. Sri has advised more than 25 customers, providing asset management, energy analytics, risk management and trading solutions, and teaches quantitative methods and analytics to MBA students at Babson College. He is the author of several quantitative finance articles and is working on a forthcoming book, to be published by Wiley, titled Financial Application Development: A Case Study Approach. He can be reached at sri@quantuniversity.com.

For more information on our model risk services, contact:
QuantUniversity, LLC
A data and quantitative analysis company
www.quantuniversity.com
sri@quantuniversity.com