Basel op risk modelling blow shifts focus to Pillar 2
Demise of AMA leaves industry needing risk-sensitive approach for calculating top-up capital, says consultant
Now that operational risk managers are coming to terms with the loss of modelling under Basel’s new SMA, or standardised measurement approach, banks’ focus is shifting to Pillar 2 requirements as a way of preserving a degree of sensitivity in op risk capital calculations.
Many large banks have repurposed their soon-to-be-redundant AMA models to help calculate their Pillar 2 estimates, which domestic regulators impose as an adjunct to core Pillar 1 capital. But the AMA, or advanced measurement approach, has faced many challenges.
Issues such as correlation, external losses and the application of scenarios have created large quarter-on-quarter fluctuations in capital numbers within individual banks, never mind the variability across firms in how the methodology is applied.
With banks facing the additional challenges of model risk, model complexity and time pressure to deliver models, alongside regulators' reasonable desire for improved transparency, the time is right to review a genuinely risk-sensitive operational risk capital methodology.
Undervalued
The operational risk discipline has always been regarded as the poor cousin of credit and market risk; yet for most institutions, its capital impact is second only to credit risk. Op risk managers must take account of the interplay of hundreds of systems within banks, combined with human fallibility both inside and outside the organisation, in making their capital calculations. This has led the op risk community to seek increasingly complex models as part of the process of improving estimation of losses.
The application of complex models can be justified where they improve estimation or add insight into what may be driving losses. Accordingly, continued application of improved modelling techniques should always be encouraged where they can be applied effectively in an organisation. The issue this creates is a lack of transparency, both internally and to external parties such as regulators and shareholders. It is this that has driven regulators' desire for the SMA.
Amid disagreements over the SMA and the recent debacle over failed AMA submissions – at least two sizable banks are known to have struggled for three years to get their AMA models accepted by their regulator after initial submission and resubmission – an improved Pillar 2 modelling methodology needs to emerge, one that meets the demands of the stakeholder community and supports senior managers' ability to identify and improve their business units.
For the systemically important banks that have already collected significant amounts of data across their businesses, a strong data-driven approach provides the most transparent means of assessing operational risk capital requirements. For banks that do not have sufficient data, a programme of data collection and categorisation needs to be put in place.
For banks with sufficient data on their operations, the Pillar 2 capital model needs to be built according to a set of key principles. First, it should be empirically based, using historical numbers where they can be justified; for example, where there has been little change in systems, management or staff turnover. Second, it must display risk sensitivity, with the numbers linked to losses in a straightforward way. Third, the model should include internal controls and business factors, together with external factors such as economic forecasts. Finally, it should be benchmarked against scenarios, while also providing benchmarking against the organisation's peer group.
Many of the major banks have already invested in the methodology to achieve several of the points above.
LDA the way
The use of the loss distribution approach, or LDA, to estimate extreme losses is tried and tested, and can be combined with robust statistical methods – such as factoring in the uncertainty of parameter estimation – to produce a central capital forecast along with a confidence interval around that estimate. With capital estimation driven by infrequent losses, failing to include an estimate of the range of outcomes would be regarded as a serious oversight.
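As a rough illustration of what such an approach might look like, the sketch below simulates a Poisson-frequency, lognormal-severity LDA and bootstraps the loss data to put a confidence band around the 99.9% capital number. The loss history, the distributional choices and every figure are illustrative assumptions, not a prescription for any particular bank.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative stand-ins for a bank's internal loss history: annual event
# counts and individual loss severities (all figures are assumptions)
annual_counts = np.array([14, 9, 17, 12, 11])
severities = rng.lognormal(mean=10.5, sigma=1.8, size=annual_counts.sum())

def capital_at(lam, mu, sigma, quantile=0.999, n_sims=20_000):
    """Simulate annual aggregate losses (Poisson frequency, lognormal
    severity) and return the requested quantile as the capital estimate."""
    counts = rng.poisson(lam, size=n_sims)
    sev = rng.lognormal(mu, sigma, size=counts.sum())
    totals = np.bincount(np.repeat(np.arange(n_sims), counts),
                         weights=sev, minlength=n_sims)
    return np.quantile(totals, quantile)

# Point estimates fitted to the (illustrative) data
lam_hat = annual_counts.mean()
mu_hat = np.log(severities).mean()
sigma_hat = np.log(severities).std(ddof=1)

# Bootstrap the loss data, refit and recompute capital to express
# the estimation uncertainty around the central number
boot = []
for _ in range(30):
    c_b = rng.choice(annual_counts, size=annual_counts.size, replace=True)
    s_b = rng.choice(severities, size=severities.size, replace=True)
    boot.append(capital_at(c_b.mean(), np.log(s_b).mean(), np.log(s_b).std(ddof=1)))

central = capital_at(lam_hat, mu_hat, sigma_hat)
low, high = np.percentile(boot, [5, 95])
print(f"central 99.9% capital: {central:,.0f} "
      f"(90% bootstrap band: {low:,.0f} - {high:,.0f})")
```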
By applying a suitable approximation to the loss distribution, the major drivers of risk within an organisation can be sufficiently captured, as a study by the Japanese Financial Services Agency has shown. An LDA-based methodology fits the key principles above: it is empirically driven, risk sensitive, and allows for the application of internal and external scenarios and stress testing. The methodology can be benchmarked against the AMA and the SMA, and allows the confidence interval to be adjusted so that banks can tune it to their own risk appetite.
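One widely cited closed-form candidate for heavy-tailed severities – not necessarily the approximation used in the FSA study – is the single-loss approximation, sketched below for a Poisson–lognormal model with roughly the same illustrative parameters as the sketch above.

```python
# Single-loss approximation: for heavy-tailed severities, the 99.9% quantile
# of the annual aggregate loss is roughly the severity quantile at
# 1 - (1 - 0.999) / lambda, where lambda is the expected annual loss count.
# Parameters below are illustrative assumptions, not calibrated values.
from math import exp
from scipy.stats import lognorm

lam, mu, sigma = 12.6, 10.5, 1.8   # Poisson mean and lognormal (log-scale) parameters
alpha = 0.999
sla = lognorm.ppf(1 - (1 - alpha) / lam, s=sigma, scale=exp(mu))
print(f"single-loss approximation of the 99.9% capital: {sla:,.0f}")
```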
A workable methodology must be able to factor in both current and future controls. Within the LDA, this is best achieved by adjusting the frequency and severity distributions of the drivers of the losses. Frequency scaling can be applied both for internal scenarios and for external factors.
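As a sketch of what that adjustment might look like, the snippet below restates the simple Poisson–lognormal simulator from the first sketch so it stands on its own, then re-runs it with a scaled frequency for an assumed control improvement and a shifted severity for an assumed scenario. The 30% frequency cut and the severity shift are arbitrary illustrative numbers.

```python
import numpy as np

rng = np.random.default_rng(1)

def lda_capital(lam, mu, sigma, q=0.999, n_sims=20_000):
    """99.9% quantile of simulated annual aggregate loss
    (Poisson frequency, lognormal severity)."""
    counts = rng.poisson(lam, size=n_sims)
    sev = rng.lognormal(mu, sigma, size=counts.sum())
    totals = np.bincount(np.repeat(np.arange(n_sims), counts),
                         weights=sev, minlength=n_sims)
    return np.quantile(totals, q)

lam, mu, sigma = 12.6, 10.5, 1.8                      # baseline parameters (illustrative)
base = lda_capital(lam, mu, sigma)
improved_control = lda_capital(0.7 * lam, mu, sigma)  # control assumed to cut event frequency by 30%
stressed_scenario = lda_capital(lam, mu + 0.4, sigma) # scenario assumed to raise typical severity
print(f"base: {base:,.0f} | improved control: {improved_control:,.0f} "
      f"| stressed scenario: {stressed_scenario:,.0f}")
```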
The factors that drive internal losses within banks are varied, and the use of correlations – which may not hold in the tail of op risk loss distribution estimates – has proved controversial and, at best, statistically ambiguous.
The simplicity of current approaches to loss event dependency leaves considerable room for improvement. With machine learning techniques coming to the forefront of model construction, investing in the enrichment of internal data – from systems data to management surveys, employee satisfaction surveys and cultural assessments – may, in time, provide insights into loss drivers and the dependency of losses across an organisation. This area, however, is still under development and some way off inclusion in Pillar 2 capital calculation methods.
The ability to assess the impact of scenarios is critical to an operational risk methodology: these can range from closed or closing business units to further automation and compulsory stress testing. Using scenarios to shift the frequency or severity of the loss distributions associated with the affected or similar business units provides the most transparent means of determining their impact. Risk managers should only use external losses as a benchmark to inform potential scenarios in an improved Pillar 2 model where they are confident in how the external loss arose; otherwise, the blind inclusion of such losses will significantly bias estimates away from a bank's business model and its associated controls.
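A toy illustration of that bias, under the assumption of a simple lognormal severity fit, is sketched below: pooling a single headline external loss with a modest internal sample moves the fitted severity tail sharply. All figures are illustrative.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(7)
internal = rng.lognormal(10.5, 1.8, size=60)   # stand-in for a bank's internal severities
external_loss = 5e8                            # a single headline external loss

def severity_tail(losses, q=0.999):
    """99.9% severity quantile from a lognormal fit to the loss sample."""
    logs = np.log(losses)
    return lognorm.ppf(q, s=logs.std(ddof=1), scale=np.exp(logs.mean()))

print(f"internal data only:      {severity_tail(internal):,.0f}")
print(f"external loss pooled in: {severity_tail(np.append(internal, external_loss)):,.0f}")
```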
Simply put
Although many larger banks already have some of these elements in place as part of their current AMA, smaller banks may have steered clear of the AMA because of its complexity and cost to implement. These banks would still like an improved risk-sensitive, granular, business-relevant operational risk methodology that can be clearly communicated both internally and to their regulators. For regulators' part, a complex methodology is costly and time-consuming to review and respond to, and typically the more complex the model, the more reluctant the regulator is to give banks leeway.
Implementing such a straightforward methodology – one that meets all the criteria for a sound capital estimator – across all relevant banks would give banks and their regulators insight into their relative operational efficiency. It would also help drive further investment in mitigating and understanding operational losses, as well as ensuring banks are sufficiently capitalised.
Chris Cormack is managing partner at a consultancy which focuses on quantitative solutions for financial risk management. Prior to this he was head of market risk for a large banking group and a quantitative risk manager for a UK hedge fund.