
Risk.net podcast: DTCC’s Lind on FRTB, data pooling and NMRFs
As many as 70 banks globally could adopt the internal model approach for market risk capital

Reports of the death of internal models may have been greatly exaggerated.
Tim Lind, head of data services at the Depository Trust & Clearing Corporation, believes the largest banks with the most complex trading books will still use internal models to calculate their market risk capital requirements under the Fundamental Review of the Trading Book (FRTB).
“The number of banks that go [for the] internal model approach – the IMA – we’re thinking somewhere in the neighbourhood of 60 to 70 banks globally,” Lind says. “It is a strategy that we think the more sophisticated, more cross-asset-class type of trading banks will pursue.”
In this inaugural Risk.net podcast, Lind, who joined the DTCC in January, discusses the complexities of FRTB and the data pools firms such as DTCC are creating to help banks opting for the IMA.
Lind is leading the DTCC’s effort to create a shared data pool for banks struggling with FRTB’s costly capital add-ons for non-modellable risk factors (NMRFs). Risk factors in markets where trade data is thin or patchy are at greater risk of falling into the NMRF category. Bloomberg and IHS Markit are working on similar projects.
The creation of these platforms has not been easy. Progress was initially slowed by dealers’ concerns over privacy, data standardisation and a lack of clarity from regulators over governance. Large banks with bigger trade datasets argue they should not have to pay to use the data they submit voluntarily to pooling services.
Some regional banks are also reluctant to cough up local market data. In March, Risk.net reported six of Canada’s largest banks were working on an internal project – known as the Canadian Data Utility – to pool their trade data in local markets. Nordic banks are mulling a similar option. Lind addresses the development of global data pools alongside smaller, regional counterparts, and how the two can coexist.
FRTB is due to come into force in January 2022.
Lind also talks about the likelihood of regulators loosening the modellability requirements for risk factors, which currently specify that consecutive real price observations can be no more than a month apart. Banks have taken issue with this, citing the seasonal trading patterns of some assets. Some have suggested that allowing a wider gap between consecutive observations would reduce the number of risk factors classed as NMRFs.
Lind pours cold water on this theory. Doubling the time between real prices would not lead to a significant increase in modellability, he argues.
“When we went through data on asset classes we integrated, changing [the] gap period doesn’t have a massive or material impact in terms of the percentage of notional value that is now modellable that wasn’t previously modellable,” Lind says. “It’s not moving it by 50%, where 50% of my notional value was non-modellable before and you went to two [price observations] in 60 [days] instead of a one in 30, it is not that kind of magnitude of improvement.”
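To make the gap test concrete, the sketch below shows one way the modellability check Lind describes could be expressed, assuming a simplified rule of “no more than N days between consecutive real price observations” – the 30-day baseline versus a hypothetical 60-day relaxation. The function names, dates and thresholds are illustrative only, not DTCC’s or the Basel Committee’s implementation.

```python
from datetime import date

def max_gap_days(observation_dates):
    """Largest gap, in days, between consecutive real price observations."""
    ordered = sorted(observation_dates)
    return max((later - earlier).days for earlier, later in zip(ordered, ordered[1:]))

def passes_gap_test(observation_dates, max_allowed_gap=30):
    """True if no gap between consecutive observations exceeds the allowed maximum."""
    return max_gap_days(observation_dates) <= max_allowed_gap

# Illustrative risk factor with seasonal trading and a quiet summer.
obs = [date(2021, m, d) for m, d in
       [(1, 5), (2, 3), (3, 1), (3, 28), (5, 20), (7, 15),
        (8, 30), (10, 1), (11, 2), (12, 1)]]

print(passes_gap_test(obs, 30))   # False: fails the one-month rule, so non-modellable
print(passes_gap_test(obs, 60))   # True: would pass a hypothetical 60-day rule
```

A risk factor like this would flip from non-modellable to modellable under the relaxed threshold, but Lind’s point is that, weighted by notional value across the asset classes DTCC has examined, such cases are too few to change the overall picture materially.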
Interview by Dan DeFrancesco
Index
3:00 – Update on FRTB and data pooling
4:51 – The relationship between regional and large data pools
11:51 – Managing multiple data pools
17:05 – Landscape of vendors’ data pools
23:35 – Use of proxy data
25:38 – Potential changes to NMRF
31:00 – Internal versus standardised approaches
35:45 – Timeline for FRTB
To hear the full interview, listen in the player above or download the file. Future podcasts in this series will be uploaded to Risk.net, where you can also visit the series’ main page and subscribe via iTunes.