Lining up the fundamentals
The Fundamental Review of the Trading Book (FRTB) should hold no fear for the enterprising, as it provides opportunities to revamp frameworks and implement ambitious structural changes. In this Q&A, sponsored by Asset Control, Murex, Vector Risk, CompatibL and Parker Fitzgerald, our panel of market risk experts discusses the impact of this systemic change, examines the technological challenges and asks how service providers can support the banking sector.
THE PANEL
- Martijn Groot, vice president – product management, Asset Control
- Pierre Guerrier, FRTB solution specialist, Murex
- Tim Rowlands, director of research, Vector Risk
- Nick Haining, chief operating officer, CompatibL
- David Kelly, partner, Parker Fitzgerald
- Steve O'Hanlon, chief executive officer, Numerix
- Ryan Ferguson, managing director and head of credit derivatives and XVA, Scotiabank
- Lars Popken, global head of risk methodology, Deutsche Bank
- Adolfo Montoro, director, risk methodology, Deutsche Bank
What impact will the new framework have on market risk management and the banking sector more generally?
Tim Rowlands, Vector Risk: In the 1990s, the concept of value-at-risk (VAR) shook up the whole market risk process, forcing banks to buy or build new independent risk measurement technologies. The new requirements under FRTB will usher in a new generation of market risk technologies. This should be seen as a once-in-a-generation opportunity to create an independent risk measurement environment focused on speed, drill-down and strategic what-if analysis to help shape the trading business – both to manage risk and to optimise profitability. Handled in the traditional way, this would be a high-cost, low-value imposition on the business and a potential disincentive to trade. However, organisations willing to grasp the opportunity to use the most modern technologies will be able to dramatically enhance the efficiency of their risk processes without sacrificing true risk management independence.
Nick Haining, CompatibL: By specifying that internal model approach (IMA) approval is given at the desk level, and by providing clear and unambiguous approval guidelines, the Basel Committee on Banking Supervision (BCBS) democratised the IMA and made it potentially accessible to individual trading desks within mid-size firms that were previously unable to obtain internal model method (IMM) approval for the entire firm. For banks that do not pursue the IMA, the standardised approach (SA) offers greater risk sensitivity than previously available methods. On the flipside, FRTB is calibrated so that its capital requirement is prohibitively expensive for certain types of trading. In addition, its restrictions on offsetting sensitivities across business lines may force consolidation of trading desks and reporting lines in order to capture offsets that would not be available to segregated books.
Steve O’Hanlon, Numerix: FRTB is a game-changer that demands a fundamental shift in the ways banks function and manage risk. The scale and scope of the regulation is massive, as it requires previously siloed parts of the enterprise to come together and work from a unified set of models and data – not to mention that many of those models must be revised to meet the new guidelines.
Anyone with experience in banking knows that, desk-by-desk and front office to back office, each part of a bank has its own flavour and approach to these types of calculations, has data on myriad systems and uses a disparate array of spreadsheets and software.
Additionally, derivatives valuation adjustment (XVA) calculations under the XVA desk are becoming more complex, and bring significant data aggregation, data quality and accuracy challenges.
Risk departments will now have the responsibility and mandate to bring together a single view of risk across the enterprise, becoming masters of risk data governance, data infrastructure and the technology needed to support rapid and regular reporting.
David Kelly, Parker Fitzgerald: One consequence of so much regulation has been the additional cost of production that is reflected in the large headcount increase in functions like market risk. The additional demands for FRTB are likely to follow a similar path of adding more data enrichment downstream – likely delivered offshore – with only a few banks rethinking their business models. The cost pressures of regulatory programmes are crowding out investments in the revenue-generating functions of the bank. To reverse the squeeze, data origination such as liquidity horizons and risk production need to migrate to the front office with the effect that, in the medium term, market risk will step away from many of its data production processes.
Ryan Ferguson, Scotiabank: The financial crisis sparked a reform of banking’s regulatory framework, and many of the reforms should reduce the likelihood of the taxpayer being tapped for a banking sector bailout in the future. Included in this set of reforms are increased capital buffers, total loss-absorbing capacity and increased clarity around bank resolution. The banking sector is spending a tremendous amount of time, effort and money to implement changes to the market risk management framework, where it isn’t clear that benefits will be commensurate. While using expected shortfall (ES) instead of VAR captures more tail risks, it does not directly address the concerns that led to the financial crisis.
Lars Popken, Deutsche Bank: The implications for the banking sector are significant. It is a real prospect that certain businesses may become uneconomical if the rules are applied to them in their current form. To that end, the provision of such services by banks may either become more expensive for customers or not be provided at all. In certain areas it may create increased concentration risk, as the number of banks providing services reduces, providing fewer options to customers. While the drive for comparability and consistency is a key tenet, this will be heavily dependent on the consistency of jurisdictional implementation. Furthermore, the complexity of the FRTB framework is unlikely to realise the comparability of risk-weighted assets between banks, which is one of FRTB’s main targets, according to the Basel Committee.
Despite these issues and the empirical evidence that has been presented to the Basel Committee, its approach appears unwavering. Banks will need to come to terms with a world in which internal models are withdrawn for some risk types – for example, credit valuation adjustment (CVA) – and made much more difficult to use for others. A widespread move towards SAs could generate incentives for banks to pursue the same business models, potentially compounding overall systemic risk for the industry.
Additionally, the introduction of more conservative SAs and the discussions around the potential introduction of capital floors are likely to meaningfully increase the amount of capital required in the banking sector. This is in spite of the Basel Committee and other regulatory bodies saying that further reforms to the capital framework should not produce a significant overall increase in capital.
Market risk management will potentially see the benefit of the framework being implemented. Greater focus on data, consistency between risk and profit and loss (P&L), and quality assessment at a more granular level all establish a good way forward in strengthening market risk management. These factors lead to greater use of full revaluation and standardised risk factors, which provide better risk management information. That being said, risk management and capital management may diverge. For example, where a liquidity horizon established under the framework does not align with empirical evidence, we could see firms using the best available information for risk management purposes, even if the FRTB framework for capital requires something different.
Establishing capabilities to enable such distinctions also leads to better risk management through strengthening of the toolset and the flexibility it needs. The majority of the banking sector recognises FRTB as an opportunity to revamp its front-to-back infrastructure. That said, considering the high level of uncertainty attached to certain key components of the framework, regulators should carefully balance an ambitious implementation timeline, giving banks enough time to implement a robust framework.
What are the greatest challenges being faced by banks on the path to implementation?
Pierre Guerrier, Murex: We believe that FRTB is a game-changer for risk infrastructures and processes. This is due to a general lack of capacity in legacy risk systems and many trading systems to produce risk assessments consistent with trading figures and with the required accuracy across all instruments.
For banks seeking to retain or obtain approval of an internal model, risk-theoretical profit-and-loss (RTPL) attribution is by far the greatest challenge. The P&L attribution metrics call for extremely high correlations between the P&L predictions of the risk system and the front office. If the figures are not reconciled, desks automatically lose internal model approval – without the warning shot of a capital multiplier increase, and even if they have good backtesting on all flavours of P&L. So, if not already completed, producing RTPL via legacy systems and assessing its quality is a bank's most urgent task. Depending on this assessment, the risk system may need revamping to bring it closer to the front office or it may have to be replaced altogether. Front-to-risk integration is favoured, of course; however, not all legacy front-office systems are able to produce FRTB reports. Eventually, a seemingly straightforward reconciliation exercise drives a new target front-to-risk architecture, and lays out the path to deploy it.
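As a rough illustration of the attribution hurdle Guerrier describes, the sketch below computes the two ratios set out in the January 2016 FRTB text – the mean and variance of unexplained P&L relative to the hypothetical P&L – with the thresholds from that text. The exact definitions remain under regulatory discussion, so this is a minimal sketch rather than a compliant implementation, and all figures are illustrative.

```python
import numpy as np

def pl_attribution_test(hpl, rtpl):
    """Sketch of the two P&L attribution ratios in the January 2016 FRTB
    text. Unexplained P&L is hypothetical P&L (HPL, from front-office
    pricing) minus risk-theoretical P&L (RTPL, from the risk system)."""
    hpl, rtpl = np.asarray(hpl), np.asarray(rtpl)
    unexplained = hpl - rtpl
    mean_ratio = unexplained.mean() / hpl.std(ddof=1)      # breach if outside +/-10%
    var_ratio = unexplained.var(ddof=1) / hpl.var(ddof=1)  # breach if above 20%
    passed = abs(mean_ratio) <= 0.10 and var_ratio <= 0.20
    return mean_ratio, var_ratio, passed

# Illustrative daily P&L series: the risk system tracks the front office
# imperfectly, as it would with different snap times or risk factor grids
rng = np.random.default_rng(0)
hpl = rng.normal(0.0, 100.0, 250)
rtpl = hpl + rng.normal(0.0, 20.0, 250)
print(pl_attribution_test(hpl, rtpl))
```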
Even banks aiming only for the revised standardised approach (RSA) face challenges. The RSA specification clearly directs users in defining input sensitivities and stress tests – for example, curvature and loss-given default – to achieve consistency across asset classes and source systems. For instance, pervasive basket and index drill-through capabilities – both for Greeks and for stress testing – call for best-of-breed front-office analytics.
Martijn Groot, Asset Control: FRTB poses significant market data challenges. The risk factor mapping requirements necessitate that firms are able to cross-reference between internal instrument taxonomies and the Basel risk factor classification with assignment of the regulatory liquidity horizon. For banks using an IMA, risk factor assessment requires an insight into overall market activity and confirmation on a minimum number of ‘real prices’. Proving this modellability entails integration of internal data and data available from trade repositories and new pooling services. On top of this, risk managers will want to track (and be proactively notified on) any changes in mapping or modellability status due to real-price availability or a change in the drivers of the liquidity horizon, such as the market capitalisation and credit rating. Generally, banks require a more structural approach to market data sourcing, quality management and operations.
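To make Groot's modellability point concrete, here is a minimal sketch of the 'real price' count described in the January 2016 text – at least 24 observations in the past year, with no gap between consecutive observations longer than one month. Edge cases, the treatment of committed quotes and pooled data are interpretation points, so the helper below is illustrative only.

```python
from datetime import date, timedelta

def is_modellable(obs_dates, asof):
    """Sketch of the real-price test for a single risk factor: at least
    24 observations in the preceding 12 months, with no more than one
    month between consecutive observations (illustrative reading of the
    January 2016 text)."""
    start = asof - timedelta(days=365)
    dates = sorted(d for d in obs_dates if start <= d <= asof)
    if len(dates) < 24:
        return False
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    return max(gaps, default=0) <= 31

# Illustrative example: roughly fortnightly real prices over the past year
asof = date(2017, 6, 30)
trades = [asof - timedelta(days=14 * k) for k in range(26)]
print(is_modellable(trades, asof))  # True
```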
Tim Rowlands: In any new project there is the overarching question of whether to buy or build. Historically, larger banks have chosen to build or to buy and customise. Smaller banks have generally looked for simple vendor solutions to meet minimum requirements and often believe they are locked out of the more sophisticated internal models due to cost and complexity. For large banks trying to build a new risk engine in-house or extend an existing system, it is hard to know if the solution will be fast enough to meet the IMA requirements. Some banks are hoping that extensions to their front-office systems will meet the requirements, but care is needed to ensure that the independent risk oversight function is not lost. Also, risk management groups need extra drill-down and what-if analysis tools over and above just generating the regulatory reports. It is possible to expend a large amount of effort only to realise that it is difficult to extend front-office systems to cover highly computationally intensive IMA runs, and that high-performance risk engines are hard to build in-house. Banks looking to buy off the shelf are faced with lots of ‘intention-ware’.
It is hard to know which vendor will actually deliver, and when. Do you stick with your vendor of choice even if they have nothing to show? Or do you embrace a new solution that is unfamiliar and requires a change of mind-set in the IT department? Many banks, especially mid-tier and smaller ones, have the option of employing just the SA. However, if potential capital savings dictate use of an internal model, banks with a single end-of-day process and reduced product and market coverage can meet the IMA requirements effectively. The P&L attribution challenge and non-modellable risks are likely to be less onerous, as these banks usually deal in vanilla instruments in liquid markets. This is a great opportunity for these smaller banks to leapfrog their slower and larger rivals by using cloud technology, a software-as-a-service risk engine and a market data supplier's FRTB data set. The challenge is choosing the right outsourced solution.
Nick Haining: Previously, the regulatory capital methodologies that imposed heavy demands on analytics and software performance (for example, IMM) were pursued only by the largest and most sophisticated firms. In contrast, the methodologies not based on internal models were typically much simpler and did not involve significant implementation challenges. The methodology expected to be used most widely by mid-size and smaller firms for FRTB and FRTB-CVA – the SA – requires calculating a large number of sensitivities and imposes a greater challenge than the methodologies these firms previously used.
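The sensitivity burden Haining refers to comes from the sensitivities-based aggregation at the heart of the SA. As a hedged illustration, the sketch below shows the within-bucket delta aggregation step – weighted sensitivities combined under a prescribed correlation matrix. The risk weights and correlations used here are placeholders, not the calibrated Basel parameters, and the full method adds cross-bucket aggregation, vega and curvature on top.

```python
import numpy as np

def bucket_delta_charge(sensitivities, risk_weights, rho):
    """Within-bucket delta aggregation in the FRTB sensitivities-based
    approach: K_b = sqrt(max(0, sum_kl rho_kl * WS_k * WS_l)), where
    WS_k = RW_k * s_k. All parameters here are illustrative placeholders."""
    ws = np.asarray(risk_weights) * np.asarray(sensitivities)  # weighted sensitivities
    k_squared = ws @ rho @ ws
    return np.sqrt(max(k_squared, 0.0))

# Three risk factors in one bucket, illustrative numbers
s = np.array([1.2e6, -0.8e6, 0.3e6])   # net delta sensitivities
rw = np.array([0.017, 0.017, 0.017])   # placeholder risk weights
rho = np.full((3, 3), 0.65)
np.fill_diagonal(rho, 1.0)
print(f"bucket delta charge: {bucket_delta_charge(s, rw, rho):,.0f}")
```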
Steve O’Hanlon: Banks are embarking on structuring their FRTB programmes and mobilising the necessary resources to assess what it means for them. From a solutions standpoint, there are complex interdependencies to consider.
A key first step, and a daunting challenge in this process, is achieving a firm understanding of the business impact of the regulation. There is an immediate need for the results of impact studies reflecting real numbers FRTB teams can use to scale their institution’s response.
Next, they must consider how they will deal with managing cost, legacy systems and the unification of risk data. Overlapping and duplicate legacy systems present complexities and costly change management issues that create barriers to scalable growth. Siloed, black-box approaches typically used to underpin the architectural foundations of front-office, risk management and finance operations will be increasingly costly to maintain. Different products and business lines often have different analytic libraries, trade capture and data management with different technologies. To address these challenges, firms must make key functionality decisions to holistically support the front office, risk, market data and product control. As part of this, they must weigh up the costs and benefits of build-or-buy and, while full-scale systems may exist, most will have to take an approach that evolves over time.
David Kelly: The focus on approval at the desk level complicates what could have been a smooth transition from Basel 2.5 to FRTB, as it directly involves the participation of the front office in the risk management process. Desk heads are expected to prove they are in control, front-to-back, of the risk they originate. FRTB is not forgiving of risk that does not trade much or is entirely unobservable, and it will punish those products where the hedge creates noise. The desk head will need to actively manage this situation and demand much more analysis to help steer through this new regime. The main challenge for current risk infrastructure will therefore be how desk heads either switch to a much more decentralised and agile computational environment or move the heavy lifting back into the front office.
Ryan Ferguson: I think getting and maintaining IMA status is going to be a sizeable challenge. There will be a significant burden placed on regulators to evaluate dozens of new models from each bank they oversee within a very tight time frame. Aligning front-office data and models with risk management data and models will also be very time-consuming. I can see banks triaging their IMA deployment so that desks where the sensitivity-based approach (SBA) is untenable gain approval in time for the switchover to FRTB. Desks that can manage on the SBA will do so until development resources become available.
Adolfo Montoro, Deutsche Bank: FRTB is ‘fundamental’ for a reason. It introduces a multitude of new approaches and processes, spanning new methodologies, tighter quality controls on market data and desk-level approvals. FRTB also brings computational demands that are a formidable challenge for any bank, regardless of whether it aims for the IMA or SA.
Although the FRTB standard text was finalised in January 2016, a great deal of regulatory uncertainty remains embedded in the framework. Most of the questions that institutions have submitted via industry associations for clarification are still awaiting a response from regulators. Such uncertainty is another dimension of the challenge banks face when designing solutions flexible enough to cope with interpretations of the rules that may arrive only at a late stage. It is key at this point for national regulators to actively engage with banks – via industry groups or on a bilateral basis – to refine the existing framework and achieve a common interpretation of its key components.
As the 2019 go-live deadline approaches, it is important that the infrastructure departments in risk, finance and technology do not rush into building out their current infrastructures. Instead, the framework requires front-office desks to play an active role. A new intra-bank interaction model needs to be established to provide oversight on data integrity, resource usage including central processing unit (CPU) grid time, portfolio risk management, end-of-day valuation, business strategy and transfer pricing.
Heightened levels of front-office desk engagement are key because FRTB increases the operational complexity and the capital cost of running market risk; therefore, desk heads will need to redefine the suite of products that provides value-add for clients at an appropriate cost of origination. This new interaction model needs to be defined before the framework can be properly implemented.
In a nutshell, FRTB requires a complete change to the operating model of the industry between front office, risk, finance and technology.
What new demands will FRTB place on firms’ IT resources and data?
Martijn Groot: FRTB raises the bar for market data quality, insight into lineage and control around business rules operating on the data. Quite simply, because of the additional requirements and data needs, the window for reruns is greatly reduced – banks need to get it right first time.
FRTB P&L attribution testing poses much more stringent demands on the consistency between front-office and risk data. Differences in snap times, market data sources and risk factor construction methods can easily lead to failed backtesting.
There is also the requirement for 10 years of historical data, while many banks currently use only one or two years for historical simulation. Banks need to baseline the 10 years initially, but also require backfilling functionality when onboarding new risk factors. On top of that, the most stressed period over that 10 years needs to be easily identified for ES calibration.
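A concrete consequence of the 10-year history requirement is the search for the stress calibration window. The sketch below scans a long return history for the one-year window with the worst expected shortfall on a single illustrative P&L proxy series; in practice the search runs on a reduced risk factor set with the full ES model, so treat this purely as a sketch of the mechanics.

```python
import numpy as np

def most_stressed_window(returns, window=250):
    """Find the one-year (approx. 250 business day) window with the
    highest 97.5% expected shortfall on a single P&L proxy series.
    A simplified stand-in for the FRTB stress-period search."""
    best_start, worst_es = 0, -np.inf
    for start in range(len(returns) - window + 1):
        sorted_pnl = np.sort(returns[start:start + window])
        tail = sorted_pnl[: max(1, int(0.025 * window))]   # worst 2.5% of days
        es = -tail.mean()
        if es > worst_es:
            best_start, worst_es = start, es
    return best_start, worst_es

rng = np.random.default_rng(1)
history = rng.standard_t(df=4, size=2500) * 0.01   # ~10 years of daily returns
print(most_stressed_window(history))
```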
In addition, more control on any form of time-series operation is required. This includes risk factor and sensitivity calculation, the management of proxy rules and the management of shocks for regulatory and internal stress scenarios. Best practice would be to manage the calculation of these derived data sets in the market data system in order for the different risk and valuation systems to be supplied with consistent data.
Pierre Guerrier: We have heard a lot about the CPU requirements related to the liquidity-adjusted ES. However, we believe the key to the solution is software properly optimised with FRTB in mind – in particular, eliminating redundancy in calculations. Only a small part of the increased workload will be absorbed by the natural performance gains associated with hardware turnover over the next three years.
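The redundancy Guerrier mentions arises because the liquidity-adjusted ES is a cascade of ES runs on nested risk factor subsets, so a naive implementation revalues the portfolio many more times than necessary. Below is a minimal sketch of the aggregation step, with illustrative inputs standing in for full portfolio revaluation runs.

```python
import numpy as np

# Sketch of the FRTB liquidity-adjusted ES cascade, base horizon T = 10 days.
# es_by_horizon[j] stands in for ES_T(Q_j): the 10-day ES computed shocking
# only risk factors with liquidity horizon >= LH_j. In production each entry
# is a full ES run, which is where the CPU cost and the scope for eliminating
# redundant revaluations both live. All numbers are illustrative.
T = 10
horizons = [10, 20, 40, 60, 120]                      # LH_j in days
es_by_horizon = [4.0e6, 2.5e6, 1.5e6, 0.8e6, 0.3e6]   # ES_T(Q_j)

prev_horizons = [0] + horizons[:-1]
terms = [
    es * np.sqrt((lh - prev) / T)
    for es, lh, prev in zip(es_by_horizon, horizons, prev_horizons)
]
liquidity_adjusted_es = np.sqrt(sum(t ** 2 for t in terms))
print(f"liquidity-adjusted ES: {liquidity_adjusted_es:,.0f}")
```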
Market data management is another concern – there is a real need for both data quality audit for non-modellable risk factor (NMRF) classification and increased data volumes – for the stress period and the default risk charge.
But the real resource pressure comes from the daunting task of upgrading diverse systems to their chosen target architecture within a very short time frame. It applies to both IMA and SA institutions, and pressurises both consultancies and system integrators globally. Securing adequate resources to execute this strategy is the most urgent challenge.
Tim Rowlands: The cloud is the future. Banks that decide to host solutions internally will face substantial costs, as many of these solutions require investment in large CPU or graphics processing unit (GPU) clusters that cannot be reused outside FRTB. With the cloud, IT personnel and resources are instead directed to managing cloud providers, data security and internet reliability. Banks need to be working with vendors to deliver pre-built, cost-effective services, and with regulators to bring them on board with the cloud and banks' ability to manage it securely. This will also require managing user expectations around the trade-off between cost and customisation, and a focus on service delivery.
Nick Haining: The greatest challenge will be to achieve the significant increase in software performance and computing power required to provide the sensitivities for the SA for market risk and SA-CVA, and for P&L attribution in IMA. With the recent changes in regulatory framework represented by FRTB and FRTB-CVA, the number of sensitivities that need to be computed has increased dramatically. Computing them for the entire portfolio is challenging for the market risk, and even more so for FRTB-CVA, which relies on sensitivities of CVA, a metric that requires Monte-Carlo simulation for a large number of trades. This challenge can be solved by analytics advances such as adjoint algorithmic differentiation (AAD), or by increasing the capacity for cluster or cloud computing.
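Haining's point about AAD can be made concrete with a toy pricer. Bump-and-revalue costs one extra valuation per risk factor, while an adjoint (reverse-mode) pass returns the whole gradient at a small constant multiple of one valuation. The discounted cashflow 'pricer' below is purely illustrative; real FRTB and CVA portfolios substitute full valuation models, often inside a Monte Carlo simulation.

```python
import numpy as np

def price(zero_rates, cashflows, times):
    """Toy pricer: present value of fixed cashflows under zero rates."""
    return np.sum(cashflows * np.exp(-zero_rates * times))

def bump_sensitivities(zero_rates, cashflows, times, h=1e-6):
    """Bump-and-revalue: one extra pricing call per risk factor."""
    base = price(zero_rates, cashflows, times)
    grad = np.empty_like(zero_rates)
    for k in range(len(zero_rates)):
        bumped = zero_rates.copy()
        bumped[k] += h
        grad[k] = (price(bumped, cashflows, times) - base) / h
    return grad

def adjoint_sensitivities(zero_rates, cashflows, times):
    """Hand-written adjoint of the toy pricer: the full gradient
    dP/dr_k = -t_k * c_k * exp(-r_k * t_k) in a single sweep."""
    return -times * cashflows * np.exp(-zero_rates * times)

r = np.array([0.010, 0.012, 0.015, 0.018])
cf = np.array([5.0, 5.0, 5.0, 105.0])
t = np.array([1.0, 2.0, 3.0, 4.0])
print(bump_sensitivities(r, cf, t))     # approximate, N+1 valuations
print(adjoint_sensitivities(r, cf, t))  # exact, one sweep
```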
Are there likely to be areas in which banks will require guidance or assistance from consultants, vendors and other service providers?
David Kelly: The delivery of FRTB will require collaboration and co-ordination across a number of expert groups in the front office, risk, finance, technology and regulatory engagements. Consultants that have considerable industry experience in trading or risk management and know what works with these programmes can help clients make the right strategic decisions around business selection, capital planning, vendor selection and target operating models, while helping quant teams deliver prototyping tools to gain insights on how to adapt to the new capital regime.
Martijn Groot: Banks will look for the most efficient path towards compliance with new regulation. Commonalities in regulatory requirements on data need to be taken into account in programme planning to ensure optimal cost effectiveness. Data providers can help with real-price assessment and additional tagging and flagging of quotes.
Market data integration providers can assist by supplying a fully auditable sourcing and quality management process. This should cover integrating internal and external sources, pre-mapping data to Basel and other regulatory risk factor classifications, and full transparency into risk factor status, sourcing and delivery via dashboards. On top of the packaged integration and population of risk factor data, banks also need the flexibility to track deviations – for instance, in cases where they want to stray from the regulatory floor liquidity horizon.
Steve O’Hanlon: The solutions market continues to evolve, as vendors enhance and launch new functionalities to help financial firms operate effectively under the FRTB regime. There is also an opportunity to empower banks to study their business on their own and not spend millions in consultancy fees.
To determine its path forward, today’s institution must build a blueprint of its desired future-state IT and architectural strategy. As banks are not yet in a position to say what their future state will look like, there are initial steps firms can take towards implementing an FRTB strategy that could serve as a basis for a broader enterprise-wide transformation.
FRTB business impact study solutions that are cloud-enabled will allow banks to upload their portfolios, use provided market data or their own data and, within a very short time, obtain a full picture of what FRTB means for them. This will allow institutions to grasp the business implications of FRTB immediately – understanding capital charges, how FRTB is impacting each of their desks from a profitability standpoint and how operational risk and market risk are coming into play.
As banks all have different approaches to handling FRTB, it is also important that solutions of this nature be highly scalable, flexible and incredibly fast.
Pierre Guerrier: We believe vendors and integrators have a key role to play because of the fundamental changes FRTB brings to risk infrastructures, data and processes.
Banks will turn to their vendors for compliance upgrades. The complexity and granularity of the new reporting are such that systems must not just provide mandatory raw data and number crunching, but also help roll out new business processes and streamline the operations of internal model-approved institutions. For instance, the RTPL will require daily production, but also validation and sign-off, just like the hypothetical P&L. And the calculation of multiple ES measures needs to be not only CPU-efficient, but also resilient, auditable and operable.
Data providers must also help. NMRFs must be kept to as few as possible, and this will increase the need for multiple data sourcing from existing providers, security custodians and consensus of market participants. On remaining NMRFs, the challenge will be the calibration of ES-equivalent stress using scarce data. This requires bespoke methods for each risk type, and quantitative analytics departments may have to tap the resources of consultancies to kick-start the effort.
Tim Rowlands: Most banks have invested little in market risk infrastructure or human capital in recent years. This lack of internal resources will result in significant reliance on software vendors, market data suppliers and consultants to help them solve the challenges of FRTB. Multi-tenancy cloud solutions that allow banks to share hardware and receive automatic software updates and round-the-clock centralised support will revolutionise software projects such as FRTB. The move to a more prescriptive market risk environment means everyone has to calculate the same things, whether using the SA or IMA, so it makes little sense to develop these in isolation. Rate vendors are creating high-quality historic rate sets that will enable banks running the internal model to avoid extra capital hits on NMRFs. Consultants are able to use new cloud-based software tools to determine the impact of FRTB and to help banks plan their future trading strategies.
Nick Haining: Compared with previous regulations, the FRTB and FRTB-CVA documents involve unprecedented complexity in calibrating and testing the models for both the SA and IMA. The FRTB document is also very specific as to the criteria that may cause a bank to lose its IMA approval. Having been exposed to a cross-section of portfolios and implementations, consultants and software vendors who work with multiple banks will have more diverse practical experience in implementing and running the new regulatory capital methodologies than an in-house team working on a single implementation. This may help those vendors to provide greater insight into implementing FRTB and avoiding its typical pitfalls. In addition, the computational complexity of FRTB may require advanced software solutions such as AAD, which vendors may be well positioned to deliver.
Ryan Ferguson: I think there are going to be resource constraints around model development and data management prior to go-live that will need to be met through third parties.
Lars Popken: The short timeline between the finalisation of the FRTB framework and the planned implementation in 2019 – as well as the extensive array of other regulatory-driven initiatives competing for similar resources – requires banks to quickly ramp up and prioritise teams across the organisation. For example, methodology teams are significantly impacted by the new framework, as a number of new requirements need to be translated into concrete mathematics, rules and algorithms, which must be carefully and thoroughly designed, tested and documented.
These new methodologies need to be implemented into IT systems and will often require a fundamental change to IT infrastructures. Boutique consultancies can help to mitigate the potentially severe workload implications by introducing new technologies and state-of-the-art techniques.
In particular, synergies across banks can be achieved for data-intensive parts of the framework such as the observability assessment of risk factors for the internal model under the new NMRF component of FRTB.
Several data vendors have begun entering this space to propose data-pooling approaches on real transactions and committed quotes. Industry participants are now working towards agreeing common standards and vendor requirements. This co-operation will allow banks to leverage each other’s trading experience without exposing potentially sensitive information to competitors.
In conclusion, consultants, third-party providers and vendors are welcome partners in relieving the pressure on limited resources in key areas and supporting the condensed timelines.
What are the implications of moving away from VAR in favour of ES?
Nick Haining: The ES will present a considerably greater challenge to the historical or Monte-Carlo simulation models used for IMA than the older VAR-based methods did. In VAR calculation, the model has to be accurate only up to the VAR quantile, while for the ES it has to accurately represent the expected value of the distribution tail beyond the ES quantile. This means the models will have to capture the extremely low-probability events beyond the previously used VAR threshold. This presents a challenge not only to calibrating the IMA model, but also to the methodology used to backtest and validate it on a limited set of historical data in which such events may occur only a few times.
Martijn Groot: A very specific implication is that every outlier counts in the ES regime. Crudely put, a VAR process cuts off the distribution at the tail and provides an upper bound on the loss in a ‘business as usual’ situation. The ES metric is an expected tail loss: it zooms in on the tail to estimate the expected loss in the worst 2.5% of cases. This means data errors directly hit the capital requirements if they end up in the tail.
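A minimal historical-simulation sketch of the difference: VAR reads off a loss quantile, while ES averages everything beyond it, so a single erroneous tail point moves ES even when VAR barely shifts. All figures are illustrative.

```python
import numpy as np

def var_es(pnl, alpha=0.025):
    """Historical-simulation VAR and ES at the FRTB tail probability
    (97.5% ES replaces 99% VAR). Sketch: VAR is the smallest of the
    worst alpha-fraction of losses; ES is their average."""
    losses = np.sort(-np.asarray(pnl))            # losses, ascending
    k = max(1, int(np.ceil(alpha * len(losses))))
    tail = losses[-k:]                            # worst alpha fraction of days
    return tail[0], tail.mean()                   # (VAR, ES)

rng = np.random.default_rng(2)
pnl = rng.standard_t(df=3, size=1000) * 1e5       # fat-tailed daily P&L
print("clean series: VAR %.0f, ES %.0f" % var_es(pnl))

# One bad data point deep in the tail: VAR is almost unchanged, ES jumps
pnl_with_error = np.append(pnl, -5e6)
print("with outlier: VAR %.0f, ES %.0f" % var_es(pnl_with_error))
```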
FRTB also poses a number of data model requirements, such as the need for daily look-through on funds if banks want to hold them in the trading book. The value drivers of custom baskets and multi-underlying options also need to be clearly modelled.
David Kelly: The move to ES might improve the optics from a mathematical perspective, but it is a step backwards in terms of daily risk management. VAR has many features that a purist can point to as inadequate; however, its redeeming feature is its simplicity – if the trader holds this portfolio over that day, the realised P&L can be compared directly with the VAR. That direct link between VAR and realised P&L is reinforced through VAR backtesting, but is broken by the shift to ES.
Adolfo Montoro: The transition from a VAR to an ES measure has attracted a lot of attention. The Basel Committee’s primary reason for the move is to “ensure a more prudent capture of tail risks and capital adequacy during periods of significant financial stress.”
Indeed, severe tail events beyond the current VAR confidence level are, by definition, not directly captured in the current VAR metric, while they will have a significant impact on the ES figure. In practice, this will lead to various challenges – estimating the impact and likelihood of extremely rare events in the tail of the P&L distribution is a difficult task, subject to significant estimation uncertainty. This uncertainty is amplified by the relatively short but mandatory calibration horizon of one year. Due to the uncertainty and the corresponding statistical error bounds, the overall capital charge may fluctuate significantly over time, leading to challenges in the capital management process.
Backtesting the ES metric is significantly more challenging compared with VAR metrics, and various ways of assessing the quality of the ES are currently under discussion. The Basel Committee decided to indirectly validate the ES based on two VAR figures at different confidence levels. While the approach is a pragmatic one, the effectiveness and accuracy of capturing the extreme tail of the loss distribution is not assessed practically as part of the regular validation and eligibility assessment.
While the move from VAR to ES has several theoretical advantages and places more emphasis on proper tail-risk modelling, its practical merits must prove themselves over time, considering practical limitations such as stability concerns and statistical uncertainty of the estimated numbers, as well as the lack of a robust backtesting framework for the extreme tail of the loss distribution.
Ryan Ferguson: It’s going to be similar to how moving to the metric system was. ES has nice technical properties, but against that we have years of familiarity working with VAR. In the long run, the transition probably helps, but we will initially be in for a period of confusion while we get used to the new measure and its implications for capital allocation. ES also raises the bar significantly on data quality, as the whole tail of the distribution now shows up.
Do you expect certain business lines to expand or contract once FRTB is implemented?
Steve O’Hanlon: With or without FRTB, this is already happening – banks are exiting asset classes and entire business lines. But, in terms of FRTB, this depends greatly on capital impacts. With a solution such as Numerix FRTB, executives, heads of trading and heads of risk can respond to the top-level questions they are trying to get a handle on before transitioning to the development of an IT architectural strategy.
For example, there should be a close examination of FRTB capital costs and important questions should be answered upfront – determining which desks will remain operational, which business lines will be profitable under the new regulatory regime, which will have to be discontinued or restructured, and which asset classes will remain active.
On-demand reports for the standardised model that are fully automated, with the option to progress to the internal model if warranted by capital savings or other benefits, are also central to the solution. The cloud-based environment is also ideal for scalability testing, simulating realistic scenarios, conducting what-if analytics and using and testing different data sets.
Pierre Guerrier: Many institutions are looking at redefining their desk organisation to optimise the impact of FRTB capital changes. However, for some factors there is no room for risk diversification.
Regulators have tried to get a glimpse of this impact since the very first Quantitative Impact Study (QIS). One of the aims of FRTB is also to act as a disincentive to trading in products and markets perceived as comparatively risky since the financial crisis. However, the QIS may not be accurate, and the official feedback is not granular enough to reveal the fate of various business lines or whether the regulatory goals are being achieved. Competing banks and industry bodies also have an interest in not disclosing such granular data.
However, it is clear that the residual risk add-on in the RSA, combined with calibration on overly conservative liquidity horizons, will severely hit foreign exchange options trading, where digital and barrier payouts are extremely common, liquid and have never resulted in losses warranting a systemic adjustment. These activities have a strong incentive to move to, or remain under, an internal model – where they will not elicit material NMRF risk charges.
Correlation trading, whether in credit or equities, will suffer as it has nowhere to hide – under any approach, this business attracts either residual risk add-on or NMRF penalties, and a difficult RTPL reconciliation.
Nick Haining: Because of the unfavourable regulatory capital treatment of NMRFs in FRTB, and the strict criteria that must be met for a risk factor to be considered modellable, the implementation of FRTB will have the greatest impact on trading in anything other than the most liquid types of underlying. A bank that trades in an underlying with low trade volumes, perhaps even as a market-maker, may feel that there is reasonable liquidity in the underlying, but still suffer from the high capital impact of FRTB if this underlying does not fall under the modellable criteria in FRTB. In addition, the expensive convexity charge will penalise any trading in structured or other highly non-linear products, further accelerating the decline in their trading volumes.
David Kelly: Business areas that cannot evidence that they are in full control of all the risk they originate and warehouse will rightly struggle under FRTB. Products that fail to accurately attribute P&L – due to the existence of untraded input parameters or because they have model-generated noise around their production of sensitivities – will attract linear add-ons that accumulate capital with each new client transaction. Such product offerings that lock in capital for much of the duration of the deal will quickly become uneconomic and are likely to be very difficult to unwind to release capital for other ventures.
Ryan Ferguson: Liquidity may become even more concentrated, and risk that does not turn over frequently enough to cover its increased capital costs will see its liquidity further diminished.
This may become a concern for regulators in countries such as Canada and Australia, where the corporate bond market could be impacted as a result. When you add in other regulatory impacts, such as the net stable funding ratio, some of these businesses may have challenges generating sufficient returns.
I think it also exacerbates the problem of banks being ‘too big to fail’. Large, highly interconnected banks might have trading velocities high enough to make some of these marginal businesses work, whereas smaller banks with lower velocities may not be able to make sufficient returns and may need to withdraw from the market.
How will the relative attractiveness of the SA and IMA be affected by FRTB?
Pierre Guerrier: The Basel II SA was decried as coarse and conservative, but it benefited from the simplicity of its implementation, especially compared with Basel 2.5 internal models. With FRTB, both approaches require rolling out complex projects, but at least the RSA becomes risk-sensitive, deriving as it does from parametric VAR. The internal models, on the other hand, raise many concerns:
- Implementation is far more complex than for the RSA or the Basel II internal model
- The capital saving over the RSA is much reduced
- That saving is highly uncertain, since any desk can be tossed out of the approved scope at short notice under the stringent eligibility criteria of RTPL attribution and backtesting
- The threat of a capital floor based on the RSA is already implicit, since granular comparison of IMA and RSA results in regulatory filings will expose banks with optimistic models to the mistrust of funding markets.
Nonetheless, banks already approved for the IMA have no formal option to revert to the SA, and for desks under pressure from faltering returns on capital, consuming more capital is not viable. In addition, some particular activities, such as forex options, should greatly benefit from an internal model.
Tim Rowlands: The bottom line is that capital will increase under FRTB. For trading operations to be profitable, many banks will want to use the IMA despite the operational and technical hurdles of doing so. Not only large banks, but also several small and mid-tier banks have indicated to us that they intend to run the IMA – an unintended consequence of the capital shock. The still-unreleased floor value for the ratio of internal model capital to standardised capital will have a major impact on the thinking of smaller banks. Impact analyses we have undertaken show an IMA-to-SBA ratio typically between 0.4 and 0.55. If the floor is too high, banks will not bother with the IMA.
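To see why the floor level matters so much, consider a back-of-the-envelope calculation using the ratios Rowlands cites. The floor level and the mechanics of its application were still unsettled at the time of this discussion, so the numbers below are purely illustrative.

```python
# Illustrative floor arithmetic: IMA capital is floored at a fraction of
# the standardised figure. With an IMA/SA ratio of 0.45 (mid-point of the
# 0.4-0.55 range cited above), any floor above 0.45 starts to erode the
# saving that justifies the cost of running an internal model.
sa_capital = 100.0
ima_ratio = 0.45
for floor in (0.0, 0.50, 0.60, 0.70):
    effective = max(ima_ratio, floor) * sa_capital
    saving = sa_capital - effective
    print(f"floor {floor:4.0%}: effective IMA capital {effective:5.1f}, "
          f"saving vs SA {saving:4.1f}")
```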
Nick Haining: The attractiveness of the SA or IMA to a firm depends on its portfolio composition. Generally, however, the advantage of the IMA may not be as great under FRTB as the advantage of IMM was historically, because the SA is a highly risk-sensitive method and, as such, does not involve crude overestimation of capital. With an SA-based floor to the IMA, the final calibration of the floor level will also influence the attractiveness of the IMA to banks.
David Kelly: The SA is a perfectly reasonable attempt by the Basel Committee to provide a robust and conservative view of aggregate risk across all asset classes. Unlike Basel 2.5, the SA for FRTB is viewed by supervisors as an adequate model – this is important as there is now no stigma in staying with an SBA. For banks that run a focused set of largely flow products, moving from an IMA under Basel 2.5 to an SBA for FRTB should be considered as a pragmatic alternative to a large change programme, though banks should have long-term IMA ambition for their key desks.
What might a future-state FRTB IT ecosystem look like?
Steve O’Hanlon: Firms are focused on getting to a lower cost point, as next-generation technology platforms will be a differentiator for banks and will open new market opportunities.
We envision a technology platform – such as Numerix Oneview – that can transcend the front office and middle office with a single database, that can handle XVA risk in real time and also be next-generation in terms of what is needed for market risk in the middle office.
As traders and heads of desk still require a choice of validated models and analytics to cover trader conviction, house exposure standards and legacy corporate P&L measurement, we view the front office as the starting point and a gateway to firm-wide transformational activities. There is also a shift in the front office towards operating from an enterprise exposure perspective rather than at the desk or book level.
The first set of changes in this area was XVA, which Numerix pioneered and brought to the market. These XVAs have evolved to capture market risk, as well as capital and margin. Going forward, we see the role of integrated analytics for trading, risk, finance, research and operations providing firms with a steady evolution towards cross-silo and cross-functional risk infrastructures.
And any solution must be flexible and robust enough to adapt – not only to the regulatory requirements of today, but to the next round of changes.
When will FRTB be transposed into national law, and how long do you expect its implementation to take?
Nick Haining: The technical guidance from country supervisors, irrespective of whether it is issued as a regulation or a national law, typically follows within a year of the final version of the BCBS document. If the pattern continues with FRTB, the technical guidance will be issued well in advance of the current implementation deadline. That said, country supervisors have frequently delayed implementation deadlines for new regulatory capital frameworks and, given the complexity of FRTB and FRTB-CVA, the same may well happen with the new regulations.
Martijn Groot: The full Basel timetable stretches to the end of 2019, and not all major jurisdictions have confirmed these timelines. Implementation schedules will depend on whether a bank goes for the IMA, and how heterogeneous the current risk infrastructure is.
FRTB shares certain data management requirements with other regulations, including: the need for additional tags on data; regulatory risk factor classification; the need for real prices in valuation and generally casting a wide net when sourcing market data; and documenting and tracking the use of proxy rules more clearly. The bottom line is that regulators have no tolerance for ‘sticky tape’ solutions and one of the most evident requirements is joined-up data. Sourcing clean market data continues to be a key challenge for risk calculations – and it is a waste of valuable quant time to spend it on data formatting and cleaning.
A market data hub that centrally sources, maps and assesses market data – and services needs in finance, risk and collateral – speeds up the process. More importantly, it will secure consistency between front-office and risk data.
Lars Popken: The Basel Committee suggests that national supervisors finalise transposition into national law by January 2019, with banks formally going live by the end of the year.
Realistically, implementation within the banks will take time. FRTB doesn't just require a change to methodologies; it entails a front-to-back transformation of banks' systems. For example, the FRTB test for internal model eligibility implies that market data is fully aligned between front-office and risk systems, yet the exact nature of the eligibility test is still the subject of debate between regulators and the industry.
Although the final FRTB standard text was published in January 2016, a number of components beyond the eligibility test still require clarification or interpretation prior to implementation. This may have implications for timelines – areas of uncertainty are often tackled last, especially when they are as intrusive and costly as the P&L attribution component of the internal model eligibility test.
Given this, it would be regrettable if the transposition process were a mere copy of the BCBS rules-set. It would be much more productive if national regulators worked with the industry to refine the rules-set and achieve consensus on as-yet undefined areas. Such a co-operative process would remediate many remaining concerns around FRTB.
Similarly, where a desirable framework requires thorough implementation of components well beyond 2019, such components could be phased in after formal go-live. Again, the P&L attribution component of the internal model eligibility test may be a case in point: initial monitoring on a ‘light’ version of the test could be a more appropriate approach for 2019, until it becomes a hard criterion. This would allow banks sufficient time to implement robust processes for meeting the criteria – or regulators to better understand where the test is not appropriate, despite its compelling theoretical justification. After all, its appropriateness has never been demonstrated in earnest.