Validation
What We Offer
We can validate a firm’s existing policies and procedures, or develop a governance framework of policies and procedures that fully complies with the SR 11-7 model risk guidance and validate under it. Get your money’s worth from your model developers and vendors through exceptional validation.
SRA has the expertise, resources, and methods to effectively challenge almost any financial, economic, statistical, or mathematical model to any level of rigor – in other words, we ask the right questions.
Effective Challenge
Cost-Effective Models and Development Procedures
Development Expenses Turn into Development Investments + Assets
The Two Questions We Ask
We have written and reviewed policies and procedures many times, so we fully understand their necessity and value. That experience lets us explain and describe validation with two short questions:
1. Is the math right?
This first question is the easier one to answer, and it involves checking the calculations, code, algorithms, and other mechanical aspects of a model’s construction, implementation, and processing. While clearly important, it has unfortunately been overemphasized because it’s the easiest to audit, and that’s been very costly.
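For example, one common way to answer this first question is to recompute a model output independently and compare it with the value the model reports. The sketch below is purely illustrative, with a hypothetical level-payment formula, made-up inputs, and an arbitrary tolerance; it is not any particular model or client engagement.

```python
# Hypothetical "is the math right?" check: recompute a level loan payment
# independently and compare it with the value the model under review reports.
# The formula choice, inputs, and tolerance are illustrative assumptions.

def level_payment(principal, annual_rate, n_months):
    """Standard annuity payment: principal * r / (1 - (1 + r) ** -n), monthly rate r."""
    r = annual_rate / 12.0
    if r == 0:
        return principal / n_months
    return principal * r / (1.0 - (1.0 + r) ** -n_months)


model_output = 1893.50                          # payment reported by the model (made up)
independent = level_payment(100_000, 0.05, 60)  # independent recomputation

# Flag a discrepancy beyond rounding; this is the kind of mechanical finding
# that answers "is the math right?" but says nothing about conceptual soundness.
if abs(model_output - independent) > 0.01:
    print(f"Discrepancy: model {model_output:.2f} vs. independent {independent:.2f}")
else:
    print("Mechanical check passed.")
```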
Obviously, ensuring that the math is correct is necessary for a valid model, but it’s not sufficient. Moreover, that overemphasis has led to a huge misallocation of resources, e.g., the hiring of inexperienced, junior “quants” as validators who don’t know a term loan from a line of credit and, therefore, cannot comment on the more important question: is the model conceptually sound? In other words, is it representationally faithful and reliable?
At Spero Risk Associates we have deep business experience and expert quantitative skills. That combination allows us to validate nearly any financial model to any level of rigor. We can fully validate, where other firms can only partially “vet” or “verify”.
2. Is it the right math?
This is the more important question, and it concerns conceptual soundness. It cannot be answered easily by validators who lack experience in the subject matter.
The regulatory definition of a “model” as a process with inputs and outputs is extremely useful in many situations, but this isn’t one of them. Instead, if we define a model as a “simplified representation of reality,” it becomes obvious that an understanding of reality (the economic phenomenon or transaction) is crucial to determining whether the mathematical representation is faithful and reliable, i.e., whether it makes sense.
There is no substitute for deep and broad industry experience in determining whether it’s the right math; with that experience, we can see whether the important aspects of reality are appropriately captured and where weaknesses exist. Fortunately for our clients, we understand the business, not just the math.
Through this simple framework, we can also explain typical validation mistakes and how we avoid them…
Frequently, we see others make mistakes at two extremes:
- False Positives: validators who lack industry experience confuse being difficult (about stupid stuff) with effective challenge. Seemingly important findings that have little or no potential for adverse consequences, for anyone, are over-emphasized. It’s a lack of proportionality, usually oriented toward easily verified, non-conceptual issues.
- False Negatives: for short-term benefit, expedient validators provide a “light touch” to appease noisy developers or managers, and possibly to hide their own lack of experience and understanding. (Imagine if your physician skipped tests, or withheld results, in hopes of not offending you, or because he or she didn’t really understand the possible illness.) This does nothing to benefit the firm.
Both extremes waste a great deal of time and resources, especially in the latter case, where a poorly implemented or conceptually unsound model can lead to failure costs from its use or to regulatory findings.
While the guidance lists three components of validation, executives need to understand that these components are neither independent nor mutually exclusive; for example, initial outcomes analysis provides developmental evidence and, therefore, helps establish conceptual soundness. There are only about eight pages of guidance related to validation, yet across the industry those eight pages have spawned thousands upon thousands of pages of validation policies and procedures, which are constantly being rewritten.
We like to think of the three main components of validation as being like Cerberus: validation encompasses the past, present, and future.
1. Evaluation of conceptual soundness, including developmental evidence.
2. Ongoing monitoring, including process verification and benchmarking.
3. Outcome analysis, including back-testing.
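To make the third component concrete, here is a minimal sketch of what outcome analysis (back-testing) can look like in code: it compares a model’s historical forecasts against realized outcomes, reports the mean absolute percentage error, and flags periods that breach a tolerance band. The forecast and actual series, tolerance, and function name are illustrative assumptions, not part of any particular validation framework.

```python
# Minimal, hypothetical outcome-analysis (back-testing) sketch.
# The forecast/actual series, tolerance, and function name are illustrative
# assumptions, not part of any firm's actual validation framework.

def backtest(forecast, actual, tolerance=0.20):
    """Compare forecasts with realized outcomes and flag large misses.

    Returns (mean absolute percentage error, list of flagged periods).
    """
    errors, flagged = [], []
    for period, (f, a) in enumerate(zip(forecast, actual), start=1):
        pct_error = abs(f - a) / abs(a) if a != 0 else float("inf")
        errors.append(pct_error)
        if pct_error > tolerance:
            flagged.append((period, f, a, pct_error))
    mape = sum(errors) / len(errors) if errors else float("nan")
    return mape, flagged


# Quarterly loss forecasts vs. realized losses (made-up numbers).
forecast = [10.0, 12.5, 11.0, 14.0]
actual = [9.5, 13.0, 15.0, 14.2]

mape, flagged = backtest(forecast, actual)
print(f"MAPE: {mape:.1%}")
for period, f, a, e in flagged:
    print(f"Period {period}: forecast {f}, actual {a}, error {e:.1%}")
```

In a full validation, a comparison like this would be run against thresholds agreed in the monitoring plan and documented alongside the conceptual-soundness and process-verification work.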
Validations and validation reports provide crucial, and frequently the only, feedback to developers. Proper validations provide constructive criticism, leading to more relevant and reliable models that improve decision-making and, therefore, profits.
The principals at Spero Risk Associates have been successfully applying the model risk guidance since its publication in 2011. We integrate all three components into our loss-forecasting and benchmarking framework, and we have also successfully validated existing frameworks. You may already have some of the pieces completed and simply be looking for an ongoing monitoring package, which we are happy to custom-tailor to your needs (including black-box platform model outputs).