Why not having AAD needn’t be the end of the world
Optimisation method offers quicker, more focused way of running XVA calculations
It's been more than half a decade since the start of the adjoint algorithmic differentiation (AAD) revolution. Previously, major banks had been engaged in something of an arms race, with some investing enormous sums in an effort to gain mightier processing power. But AAD has rendered much of that power redundant. The mathematical technique is capable of speeding up the calculation of risk sensitivities, or Greeks, by up to a thousand times compared with traditional methods, and is now being used by firms for a host of different applications.
Traditionally, risk sensitivities are obtained by adjusting, or 'bumping', inputs one by one and repricing the derivatives over and over again for every input and every trade. This adds up to a large number of calculations, which can sometimes take days to run. The process gets even more complicated when the sensitivities to be calculated are those of derivatives valuation adjustments (XVAs).
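To see why this is slow, consider a minimal Python sketch of bump-and-revalue, with a hypothetical `price` function standing in for a full Monte Carlo pricer. The expensive pricer must be called once per input, so the cost grows linearly with the number of risk factors.

```python
import numpy as np

def price(inputs):
    # Hypothetical stand-in for an expensive derivatives pricer;
    # a real one would run a full Monte Carlo valuation.
    return np.sum(inputs ** 2)

def bump_and_revalue(inputs, h=1e-6):
    """Finite-difference Greeks: bump each input in turn and reprice.
    Requires one full (expensive) pricer call per input, plus one
    call for the base value."""
    base = price(inputs)
    greeks = np.empty_like(inputs)
    for i in range(len(inputs)):
        bumped = inputs.copy()
        bumped[i] += h
        greeks[i] = (price(bumped) - base) / h  # one repricing per input
    return greeks

print(bump_and_revalue(np.array([1.0, 2.0, 3.0])))  # ~ [2.0, 4.0, 6.0]
```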
In contrast, AAD works out the sensitivities from a single pricing calculation, exploiting the chain rule of differentiation to compute all the sensitivities simultaneously. The result is an almost real-time calculation of risk. The downside is that the method is tough for many to get their heads around, and banks have to rewire their IT architecture around each application to be able to run it. This puts banks that haven't yet implemented AAD at a massive disadvantage.
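The contrast is easiest to see in code. The sketch below hand-writes the adjoint of a toy pricer; a real AAD deployment would use an operator-overloading or source-transformation tool, so this is only an illustration of the principle, not any bank's implementation. One forward pass records the intermediate values, and a single reverse sweep then propagates the chain rule backwards, yielding the sensitivities to every input at once.

```python
import numpy as np

def price_forward(spots, vols):
    """Toy forward pass: price = sum(exp(spots * vols)).
    Records the intermediates (the 'tape') needed by the reverse sweep."""
    a = spots * vols
    b = np.exp(a)
    v = np.sum(b)
    return v, (spots, vols, a)

def price_adjoint(tape):
    """Single reverse sweep: propagate the adjoint of the price (1.0)
    back through the tape. All input sensitivities come out of this
    one pass, however many inputs there are."""
    spots, vols, a = tape
    b_bar = np.ones_like(a)     # v = sum(b)  =>  dv/db_i = 1
    a_bar = b_bar * np.exp(a)   # b = exp(a)  =>  chain rule through exp
    spots_bar = a_bar * vols    # a = spots * vols
    vols_bar = a_bar * spots
    return spots_bar, vols_bar

v, tape = price_forward(np.array([1.0, 2.0]), np.array([0.2, 0.3]))
d_spots, d_vols = price_adjoint(tape)  # all Greeks from one extra sweep
```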
However, it looks like there is now a workaround.
In a paper published recently on Risk.net, titled Risk optimisation: the noise is the signal, Benedict Burnett, Tom Hulme and Simon O'Callaghan – all of whom work at Barclays in London – propose a technique for optimising risk calculations without AAD. In it, the authors show that speeds similar to AAD can be achieved by identifying risks that aren't material and spending less time on them, using the example of XVA risk calculations.
Typical XVA books have thousands of counterparties, but they do not all contribute equally to each risk: some are far riskier than others. If banks were to run simulations of the same specification for all counterparties, a lot of computational effort would be wasted on counterparties that are not particularly risky. "We fell into that trap and probably other banks did that as well, imposing a structure at the start rather than letting the numbers speak for themselves," says O'Callaghan.
To avoid this, the authors run so-called "lightweight" simulations, with fewer paths for each counterparty, as a preliminary step to gauge how much each one contributes to the overall estimation error. From this, they determine the number of simulation paths and the frequency of time points to be used in the full simulations for each counterparty. These two factors drive how computationally intensive the calculations are, so tuning them lets the user calculate risks with the speed and level of precision that traders require.
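A sketch of what such a lightweight pre-run might look like is below, with a hypothetical `simulate_exposure` standing in for the real XVA engine; the counterparty identifiers, pilot path count and error measure are illustrative assumptions rather than the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_exposure(counterparty_id, n_paths, n_steps):
    """Hypothetical stand-in for an XVA exposure simulation, returning
    one CVA-like estimate per path. A real engine would diffuse risk
    factors over n_steps time points and aggregate netting-set exposure."""
    scale = 1.0 + counterparty_id % 7   # some counterparties are riskier
    return scale * rng.standard_normal(n_paths)

def pilot_errors(counterparties, n_pilot=256, n_steps=10):
    """Lightweight pre-run: a cheap, low-path simulation per counterparty
    to estimate how much each contributes to the overall Monte Carlo
    error, via the standard error of the mean."""
    errors = {}
    for cp in counterparties:
        samples = simulate_exposure(cp, n_pilot, n_steps)
        errors[cp] = samples.std(ddof=1) / np.sqrt(n_pilot)
    return errors

print(pilot_errors(range(5)))
```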
The basic idea is to keep the time spent on each risk proportionate to its error, minimising the overall error for the whole portfolio within a given compute-time budget. "If I have 10 minutes to get the result, it can be much more accurate, but if I need it in a few minutes or 20 seconds, I would still be able to choose the right target amount of time," says Hulme. "It is up to the users. When there has been a very large move – like Brexit, for example – and traders need to quickly rerun the batch intraday on a less accurate but faster run, they can do that by changing a single parameter."
By applying fewer paths and time-steps to less material counterparties, the risks of the overall XVA book can be calculated much faster, while the riskier profiles get more paths and time-steps and hence more precise calculations. Thanks to this flexibility, the Greeks can be obtained one to two orders of magnitude faster than with a complete simulation, the authors say.
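One standard way to turn pilot error estimates into per-counterparty path counts is a Neyman-style allocation under a fixed compute budget. The sketch below illustrates that principle, not the authors' exact scheme; `cost_per_path`, `total_budget` and the minimum-path floor are assumptions for the example.

```python
import numpy as np

def allocate_paths(errors, cost_per_path, total_budget, n_min=64):
    """Neyman-style allocation: minimise sum_i sigma_i^2 / n_i subject to
    sum_i n_i * c_i <= total_budget, which gives n_i proportional to
    sigma_i / sqrt(c_i). Noisy counterparties get many paths, quiet
    ones few."""
    cps = list(errors)
    sigma = np.array([errors[cp] for cp in cps], dtype=float)
    cost = np.array([cost_per_path[cp] for cp in cps], dtype=float)
    weights = sigma / np.sqrt(cost)
    n = total_budget * weights / np.sum(weights * cost)
    # Floor very quiet names at n_min paths; this may slightly exceed
    # the nominal budget, which is acceptable for an illustration.
    return {cp: max(int(round(ni)), n_min) for cp, ni in zip(cps, n)}
```

In this sketch, halving `total_budget` plays the role of the "single parameter" a trader would change for a faster, less accurate intraday rerun.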
What this means is that not all banks have to rely on the magic of AAD to quickly calculate their risks.
"Everyone would want to implement AAD since it would increase precision and stability with several orders of magnitude in performance improvements; however, the cost of doing that is just too expensive," says Francois Bergeaud, the head of XVA quantitative analytics at Royal Bank of Scotland in London. "It is the fastest thing you can think of, but if you don't want to invest in a whole team of quants – say, 10 people – for three years working on refactoring the library, then this approach is a good practical compromise."
As the role of quants becomes increasingly driven by technology and optimisation, some tricks of the trade will survive and others will die out, in a tough evolutionary battle accelerated by rising costs and regulatory changes. AAD is not likely to be made extinct any time soon, but the optimisation proposed by the authors appears to give banks with fewer resources an option that puts them on a level playing field. That's all the more important at a time when risk management and capital savings come with a huge price tag attached.
Also out this month: Operational risk modelled analytically II: classification invariance, by Vivien Brunel