Need to know
- Regulators are showing impatience with the slow pace of adoption of BCBS 239 risk data aggregation and reporting principles.
- Only three banks out of 30 passed muster in the Basel Committee’s latest review of BCBS 239 compliance. All three are believed to be US banks – one of them JP Morgan.
- Deutsche Bank flunked the qualitative portion of this year’s CCAR in part because of weak data management.
- Banks could be hit with capital add-ons, as well as restrictions on their business and capital distributions, if they’re not in compliance.
- Banks were given three years, to January 2016, to reach full compliance, but some complain that was too little time given the sheer size of the overhaul required.
- There are no absolute criteria for determining compliance. But in reviews, regulators are emphasising the degree to which banks have automated data processes.
Regulators are beginning to lose patience with banks’ weak response to implementing BCBS 239, the Basel Committee on Banking Supervision’s gold standard for the calibre and collection of data at banks.
The rule on data quality and systems has both baffled and overwhelmed major banks, only three of which have managed to make the grade. The standard has been in effect since 2016, so the committee’s patience to date has seemed almost an acknowledgement of the magnitude of the effort and imagination needed to meet it.
But by some measures, banks are going backwards: two dealers that were found to be compliant with two principles in 2016 – on risk data completeness and the distribution of risk reports – have now been re-designated as materially non-compliant in Basel’s June 2018 review.
Regulatory forbearance looks to be ending, too. In its review, the committee urged national supervisors to escalate matters, availing themselves of capital add-ons and restrictions on business to prod banks into line.
The committee remained silent on whether any punishments had yet been meted out. But across the industry, local regulators have been doing the rounds and making veiled threats.
“As far as capital charges go, regulators are talking about it,” says Dessa Glasser, principal at consultancy Financial Risk Group in New York and a former chief data officer for asset management at JP Morgan. “Whether they will or not, I’m not sure, but I do think the banks have to be worried about that.”
“The ultimate lever they have at their disposal is to increase Pillar 2 capital requirements,” the London-based chief data officer of an international bank says of his supervisors. “I don’t have first-hand evidence of it happening, but that’s my expectation.”
Bringing risk into the light
BCBS 239 arose from the wreckage of the financial crisis, when many banks were found to have no clear idea of their risk exposure because of weak data and reporting practices. The standard consists of 14 principles grouped into three categories: governance and infrastructure; risk data aggregation; and risk reporting. Risk data aggregation is the process by which banks collect and process data, a quality filter crucial to being able to measure risk.
No-one doubts the regime’s importance. “If you can’t uniquely and precisely provide data with the granularity the Fed expects, you won’t be able to produce the capital forecasts as you should,” says Lourenco Miranda, head of model risk management for the Americas at Societe Generale.
The data in question is used to calculate Pillar 1 risk-based capital. If the data is too shaky, regulators could hit banks with Pillar 2 discretionary capital add-ons, to compensate for any uncertainty in the data.
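To put that in perspective, consider a stylised calculation – all figures below are hypothetical, not drawn from any actual supervisory action – of how even a modest discretionary add-on translates into a large absolute capital demand:

```python
# Stylised Pillar 2 add-on arithmetic (all figures hypothetical).
# A discretionary add-on is extra capital demanded on top of the Pillar 1
# minimum, expressed here as a percentage of risk-weighted assets (RWA).

rwa = 500e9            # hypothetical risk-weighted assets: $500bn
pillar1_ratio = 0.08   # Basel Pillar 1 minimum total capital ratio: 8%
pillar2_addon = 0.01   # hypothetical 1% add-on for weak risk data

pillar1_capital = rwa * pillar1_ratio   # $40bn of required capital
addon_capital = rwa * pillar2_addon     # an extra $5bn tied up by the add-on

print(f"Pillar 1 minimum: ${pillar1_capital / 1e9:.0f}bn")
print(f"Cost of a 1% Pillar 2 add-on: ${addon_capital / 1e9:.0f}bn")
```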
A managing director at a Big Four auditing firm says he’s seen “a couple of situations where the assessment… of a bank’s ability to survive under a stressed scenario would have materially changed if the risk data aggregation was accurate.”
In the United States, the Fed is not believed to have used Pillar 2 add-ons to enforce BCBS 239 – though it does measure compliance during its on-site supervisory reviews. These reviews tend to focus on principle 1, governance, and principle 2, data architecture and IT infrastructure.
Examiners are said to be especially interested in the controls banks place around manual intervention with risk data. If a bank is unable to explain how it ensures the quality of the data it reports, it could fail the qualitative portion of the Fed’s Comprehensive Capital Analysis and Review (CCAR) stress test, as Deutsche Bank found out this year. In rejecting the bank’s capital plan, the Fed cited “material weaknesses in… data capabilities and controls” among the reasons.
“Deutsche Bank’s capital plan was rejected exactly on the basis of its data and IT infrastructure, which is at the heart of BCBS 239,” says a former regulator.
Deutsche Bank declined to comment on whether its CCAR failure was linked specifically to its implementation of BCBS 239. The Fed also declined to comment on whether it used BCBS 239 non-compliance as an explicit factor in CCAR failures.
An executive in charge of BCBS 239 implementation at a Swiss bank says Finma has informed the bank that it has capital add-ons at its disposal, but so far has not said it would impose them.
However, he adds that Finma has told the bank that “any new models that we submit that will have a capital impact need to be assessed vis-à-vis BCBS 239 compliance”. Finma declined to comment.
The irony of banks passing risk capital tests while failing the test on the very data underpinning them is not lost on the industry. And there is a recognition that unless regulators do in fact punish financial institutions for non-compliance, banks won’t get fully on board.
“If as a result of being noncompliant it means that the capital, risk and limit decisions you’re taking are not sufficiently well-informed because you can’t rely on the accuracy and integrity of the numbers on which those decisions are being made, then there needs to be a penalty,” says the executive at the Swiss bank.
The Basel Committee grades compliance on a scale of 1 to 4, with 4 being fully compliant; 3 largely compliant; 2 materially non-compliant; and 1 indicating no serious effort to comply, or “not yet implemented”.
The committee’s June progress report reflected no more than a slight improvement in governance and infrastructure, and mixed results for risk data aggregation and risk reporting. It did not identify the three banks that met the BCBS 239 standard, but a Swiss bank executive believes they are US banks – one of them being JP Morgan. Neither the Fed nor JP Morgan commented.
On the Continent, the European Central Bank’s review of BCBS 239 concluded not a single bank under its jurisdiction was in full compliance with the principles, and that the overall level of compliance was “unsatisfactory”.
And full compliance is unlikely anytime soon – the ECB noted that several banks’ programmes to get BCBS 239 on track run through 2019 and even later.
The ECB review found weaknesses in all three of the categories covered by BCBS 239: governance and IT infrastructure; risk data aggregation; and risk reporting. The weakness in data governance was due to the failure to define clear lines of responsibility for data quality, the report said.
Speak softly and carry a big stick
Despite the apparent lack of capital add-ons, regulators have certainly made their expectations clear during supervisory reviews, banks say. Take the Fed, for instance. If it isn’t satisfied with a bank’s data arrangements, it is likely to issue the bank with a “matter requiring attention”, or MRA, notice.
Vendors warn, however, that banks should not try to fix these MRAs without addressing the larger data management issues that led to them in the first place.
Ambreesh Khanna, group vice-president of financial services analytical applications at Oracle, says banks are “trying to fix the MRAs by putting in patches instead of fixing the root cause”.
“When I talk to some of the larger banks, the focus is on fixing MRAs on basic issues like regulatory reporting,” he says. “A lot of banks in the US are scrambling to fix those issues, instead of taking a step back and saying, ‘the reason we have these MRAs is because we don’t have a single [data] repository for risk and finance’.”
The Fed’s reviews are also putting banks on the spot in exacting ways.
“During a supervisory review, it is not uncommon for a regulator to ask for a new exposure calculation, such as a newly defined stress scenario. They could then check response times for a real-life scenario,” says Falk Rieker, global head of industry business unit banking at vendor SAP.
“Furthermore, the newer, more granular data-driven regulations enable the regulator to assess data accuracy, data integrity and data aggregation during reviews.”
When regulators conduct on-site visits, they are likely to ask questions about how the bank is managing data to ensure what is being reported is accurate. Regulators are conducting “fire drills” that test banks’ ability to aggregate data and produce reports within a short time.
“Finma gave us a fire drill exercise to check how good the numbers were,” says the Swiss bank executive. “The acid test is the quality, speed and effectiveness of the process by which numbers are driven out.”
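What such a drill tests, in essence, is whether risk numbers can be assembled from the underlying systems and delivered inside a deadline. A minimal sketch of that logic – the source feeds, figures and four-hour deadline below are all invented for illustration:

```python
import time

# Sketch of a supervisory "fire drill": aggregate exposures across entities
# and check the report lands within the deadline. In practice the sources
# would be live system queries, not the in-memory dictionaries used here.

SOURCES = {  # hypothetical per-entity exposure feeds, in $m
    "london_entity": {"rates": 120.0, "credit": 80.0},
    "ny_entity": {"rates": 200.0, "credit": 50.0},
}
DEADLINE_SECONDS = 4 * 60 * 60  # e.g. the regulator wants numbers in 4 hours

def run_fire_drill():
    start = time.monotonic()
    report = {}
    for entity, exposures in SOURCES.items():
        for risk_type, amount in exposures.items():
            report[risk_type] = report.get(risk_type, 0.0) + amount
    elapsed = time.monotonic() - start
    return report, elapsed, elapsed <= DEADLINE_SECONDS

report, elapsed, on_time = run_fire_drill()
print(report, f"produced in {elapsed:.4f}s, within deadline: {on_time}")
```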
A growing area of focus for regulators is “data lineage”, or the tracking of data from origination through to reporting. The ECB conducted a data lineage analysis as part of its review, and the Basel Committee also noted in its progress report that several banks had made improving data lineage a priority.
“When you look at the principles in terms of how to aggregate the data, you need to look at data lineage, and you need to understand how the data is being transformed,” says Glasser at Financial Risk Group.
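In practical terms, lineage means each reported figure carries a record of where it originated and of every transformation applied to it on the way. A minimal sketch of the idea, assuming a hypothetical in-house lineage log rather than any particular vendor tool:

```python
from dataclasses import dataclass, field

# Minimal data lineage sketch: each value carries the chain of
# transformations applied between source system and risk report.
# Source names and steps are illustrative only.

@dataclass
class TracedValue:
    value: float
    lineage: list = field(default_factory=list)

    def transform(self, fn, step: str) -> "TracedValue":
        """Apply a transformation and append it to the lineage log."""
        return TracedValue(fn(self.value), self.lineage + [step])

# Originate a raw exposure from a (hypothetical) trading system...
raw = TracedValue(1_000_000.0, ["extracted from trading_system_A"])

# ...then record each change made on the way to the report
reported = (
    raw.transform(lambda v: v * 0.88, "converted USD to EUR at 0.88")
       .transform(lambda v: v - 150_000.0, "netted collateral")
)

print(reported.value)    # the figure that lands in the risk report
print(reported.lineage)  # the audit trail an examiner could inspect
```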
Many of the issues with risk data aggregation and reporting stem from systems and processes spanning multiple legal entities, businesses and countries. As banks have grown by acquisition over several decades, they have accumulated legacy infrastructures that need to be woven into the existing IT fabric. Automating the whole is crucial.
“The minute you take a bunch of analysts and they start pulling out spreadsheets to prove to the regulator that they have a handle on data, you’ve already lost the game,” says Khanna. “The only way you can prove to the regulator that you know what’s going on is if you have an automated system that reconciles data from all jurisdictions, legal entities, across all products, without anyone touching that data, all the way through to regulatory reporting.”
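A toy version of the automated check Khanna describes – the feed contents and tolerance here are invented for illustration – would compare the risk and finance views of the same exposures and flag unexplained breaks:

```python
# Toy reconciliation: compare exposure totals reported by risk and finance
# systems, flagging any difference beyond a tolerance. All values are
# hypothetical; a real system would pull from golden-source data stores.

risk_feed = {"rates": 320.0, "credit": 130.0, "equity": 45.0}     # $m
finance_feed = {"rates": 320.0, "credit": 128.5, "equity": 45.0}  # $m
TOLERANCE = 1.0  # largest acceptable break, in $m

def reconcile(a: dict, b: dict, tolerance: float) -> dict:
    """Return the breaks between two feeds that exceed the tolerance."""
    return {
        key: abs(a.get(key, 0.0) - b.get(key, 0.0))
        for key in set(a) | set(b)
        if abs(a.get(key, 0.0) - b.get(key, 0.0)) > tolerance
    }

breaks = reconcile(risk_feed, finance_feed, TOLERANCE)
if breaks:
    print("Unexplained breaks to investigate:", breaks)  # {'credit': 1.5}
else:
    print("Risk and finance feeds reconcile within tolerance")
```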
But the industry itself sees value in the effort. Unlike much regulatory noise and expense, solid data is fundamental to the business.
“Banks overall see the value of the BCBS 239 principles, and have been creating policies and procedures for how they’re going to adopt them,” says Rob Lee, executive director at AxiomSL.
Some banks say they have been able to speed up implementation by putting data governance under the purview of executive subcommittees – an approach Mizuho International, for instance, says has helped break down internal silos.
“What we are doing is making sure each business function understands why the data is relevant to the broader organisation,” says Gary Goldberg, chief data officer for Europe at Mizuho International.
Why so slow?
Many factors have dragged on adherence to BCBS 239, but perhaps none more than the “principles” part of it. Because BCBS 239 is a set of principles, there are no absolute criteria for determining compliance. It is therefore up to regulators to interpret those principles, and then up to individual banks to work out how they are embedded within other regulations.
“The question of whether a bank is compliant is not as straightforward as you might expect,” says Holger Harreis, a senior partner in McKinsey’s global risk practice. “The regulation is principle-based, so the metrics for measuring compliance are not concretely pre-defined.”
“These principles are inherently difficult to measure,” adds Thomas Kimner, head of global marketing and operations, risk management at SAS Institute in Washington, DC. “They’re difficult to implement and execute, and the fact that they have to be incorporated into other aspects of regulatory compliance does provide a challenge across the board.”
In fact, the Basel Committee itself acknowledges in its report that the slow progress is partly due to ever-rising supervisory expectations. In effect, supervisors have been raising the bar for compliance as they discover new practices in the industry, cherry-picking as they go. This occurs even as some banks struggle to meet or even understand the initial expectations.
Harreis says “banks are continuously improving their capabilities, which sets a constantly evolving bar across the sector. Supervisors raise their expectations as they see new best practices”.
Another element is time. The compliance date for BCBS 239 was January 2016, three years after the principles were issued. But given the complexity of global banks’ operations, which can encompass myriad products across multiple time zones, some thought three years was too short.
Stacy Coleman, who represented the Federal Reserve on the Basel Committee group that drafted BCBS 239, recalls questioning the three-year timeframe.
“In talking with others in the group, it was a relatively short timeframe to make changes of this magnitude,” says Coleman, now managing director at Promontory Financial Group. “Supervisors wanted to push hard on banks to fix things. I thought five years was more realistic.”
Yet another factor is that supervisors themselves are flummoxed by the principles, as well as by the moving target that is technology.
“I don’t think they have a clue,” says one Swiss bank executive. “That’s more or less what they’ve said to us in our dialogues with them.”
Editing by Joan O’Neill