This article was paid for by a contributing third party.
Solving the data challenge: technical solutions for optimisation of risk management, capital and liquidity resources
Since the financial crisis that began in 2007–08, regulatory requirements around capital adequacy, liquidity, funding, balance sheet size and leverage have become increasingly stringent. As a consequence, financial institutions need to manage their scarce financial resources ever more wisely, explains Opensee’s Emmanuel Danzin
For capital markets traders and risk managers, this means having an acute understanding of risk and of how resources are used, so that business opportunities can be optimised under a strong set of constraints while complying with risk limits and overall strategy. For treasury functions, it requires a grasp of future cashflow projections under multiple scenarios and from multiple angles, and managing liquidity risk with minimal error margin and cost. In both cases, desks need to handle significant datasets by running non-linear aggregations and, equally, by drilling down to the most granular level – all of which must be done at speed and with autonomy for business users.
Over the past few years, technologies have been employed either to leverage the fastest hardware – such as in-memory (RAM) uploads or graphics processing unit-based full revaluations – or to use smart shortcuts, such as pre-aggregations or machine learning that analyses historical data use and pre-indexes datasets accordingly. Significant challenges remain, however, in providing the full wealth of datasets to business users quickly and consistently without excessive infrastructure costs.
The problem with non-linearity
Essentially, banks are having to grapple with massive amounts of data because the aggregations that need to be calculated are non-linear. Marginal impacts of changes – for example, new trades or changes in projected cashflows – require complex reaggregation with the rest of a portfolio as different netting methodologies are applied.
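To make that non-linearity concrete, the minimal sketch below – written in Python with a toy netting-set exposure measure and invented trade data, not drawn from any bank’s or Opensee’s implementation – shows why a trade’s marginal impact cannot be read off the trade in isolation: the whole netting set has to be reaggregated.

```python
# Minimal sketch of why aggregation is non-linear: the incremental
# exposure of a new trade depends on the rest of its netting set,
# so it cannot be computed from the trade alone.
from collections import defaultdict

def netted_exposure(trades):
    """Sum mark-to-markets per netting set, floor each set at zero."""
    by_set = defaultdict(float)
    for t in trades:
        by_set[t["netting_set"]] += t["mtm"]
    return sum(max(v, 0.0) for v in by_set.values())

def incremental_exposure(portfolio, new_trade):
    """Marginal impact requires re-aggregating the full portfolio."""
    return netted_exposure(portfolio + [new_trade]) - netted_exposure(portfolio)

portfolio = [
    {"netting_set": "CPTY_A", "mtm": +8.0},
    {"netting_set": "CPTY_A", "mtm": -5.0},
    {"netting_set": "CPTY_B", "mtm": +2.0},
]
new_trade = {"netting_set": "CPTY_A", "mtm": -4.0}

# Standalone, the new trade's exposure is zero (negative mark-to-market),
# yet within its netting set it offsets positive exposure, so the
# marginal impact of adding it is -3.
print(incremental_exposure(portfolio, new_trade))  # -3.0
```

The same reasoning applies to the other resource metrics discussed below: each change forces a fresh aggregation over the relevant netting perimeter.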
In the capital markets space, risks and resources used are monitored through marked-to-future projections, risk-weighted assets, sensitivities and cross-valuation adjustments (XVAs) accounting for credit exposure, funding impact and capital use, among others, as well as an array of risk metrics. To determine the footprint of transactions on these scarce resources, ‘what-if’ simulations are employed to assess marginal impacts through non-linear aggregations. These can be performed either before or after the transaction has been traded, as follows:
Pre-trade calculations
Pre-trade calculations measure the incremental impacts of new trades so traders can assess their relevance to the desk in terms of strategy and risk. Here, speed is key: traders need to calculate these marginal impacts on all resources quickly to decide whether or not to do the trade and with what economics, while minimising the error margins that are a source of expensive buffers.
Post-trade impacts
Post-trade impacts also need to be measured and assessed with potential mitigation actions in mind, such as portfolio reshuffling, clearing, compressions, trade restructuring and synthetic offloads. Speed is less of an issue in these cases, yet the multiple scenarios that need to be considered translate into massive volumes of data.
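As a hedged illustration of where those volumes come from, the sketch below scores hypothetical mitigation candidates by reaggregating a toy resource metric over every (action, scenario) pair – the kind of granular grid that grows quickly as scenarios multiply. All names, shocks and figures are invented for illustration only.

```python
# Illustrative sketch: score candidate post-trade mitigation actions by
# re-aggregating a toy, scenario-shocked exposure metric over each
# modified portfolio, keeping the full (action, scenario) grid for drill-down.
from itertools import product

def exposure(portfolio, shock):
    """Toy metric: shocked mark-to-markets, floored at zero per netting set."""
    sets = {}
    for trade in portfolio:
        key = trade["netting_set"]
        sets[key] = sets.get(key, 0.0) + trade["mtm"] * shock
    return sum(max(v, 0.0) for v in sets.values())

def evaluate(candidates, scenarios):
    """Metric per (action, scenario) pair - one row per cell of the grid."""
    return {
        (name, s_name): exposure(portfolio, shock)
        for (name, portfolio), (s_name, shock) in product(candidates.items(), scenarios.items())
    }

base = [{"netting_set": "CPTY_A", "mtm": 8.0}, {"netting_set": "CPTY_B", "mtm": 2.0}]
candidates = {
    "do_nothing": base,
    "compress_A": [t for t in base if t["netting_set"] != "CPTY_A"]
                  + [{"netting_set": "CPTY_A", "mtm": 3.0}],
}
scenarios = {"base": 1.0, "stress_up": 1.25, "stress_down": 0.75}

for key, value in sorted(evaluate(candidates, scenarios).items()):
    print(key, round(value, 2))
```

With realistic portfolios, the grid is actions × scenarios × granular positions rather than a handful of rows, which is why these calculations quickly become a big data problem.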
Asset-liability management (ALM)/treasury
In the ALM/treasury space, aggregations are likewise non-linear and diverse as they must comply with a wide array of netting rules, such as accounting standards – International Financial Reporting Standards and US Generally Accepted Accounting Principles, for example – leverage ratio and tax rules, as well as any additional constraints that apply to global systemically important banks. These calculations – for example, of liquidity metrics such as the net stable funding ratio and the liquidity coverage ratio – need to be run on the fly for multiple cashflow projection scenarios, each with granular data so the full wealth of the dataset is retained. Ideally, treasurers want to manage liquidity with precision, covering various scenarios accurately and at lower hedging cost. With a granular and rich dataset, future liquidity positions can be simulated, then aggregated and investigated with enhanced accuracy. Access to a full array of datasets also opens up a wealth of new possibilities – for example, applying machine learning to historical data to better predict future behaviours.
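As a rough illustration only – not the full regulatory calculation with run-off factors, and not any vendor’s implementation – the sketch below computes a simplified liquidity coverage ratio per scenario from granular cashflow projections, keeping the row-level data available for drill-down. The data, scenario names and HQLA figure are assumptions.

```python
# Hedged sketch: a simplified LCR-style ratio per scenario, aggregated
# on the fly from granular cashflow projections.
from dataclasses import dataclass

@dataclass
class Cashflow:
    scenario: str
    day: int          # business days from today
    amount: float     # positive = inflow, negative = outflow

def simplified_lcr(hqla: float, cashflows: list, scenario: str, horizon: int = 30) -> float:
    """LCR ~ HQLA / net outflows over the horizon, inflows capped at 75% of outflows."""
    window = [cf.amount for cf in cashflows if cf.scenario == scenario and cf.day <= horizon]
    outflows = -sum(a for a in window if a < 0)
    inflows = sum(a for a in window if a > 0)
    net_outflows = max(outflows - min(inflows, 0.75 * outflows), 1e-9)
    return hqla / net_outflows

projections = [
    Cashflow("base", 5, -120.0), Cashflow("base", 12, 40.0), Cashflow("base", 25, -30.0),
    Cashflow("stressed", 5, -180.0), Cashflow("stressed", 12, 15.0), Cashflow("stressed", 25, -60.0),
]
for scenario in ("base", "stressed"):
    print(scenario, round(simplified_lcr(hqla=150.0, cashflows=projections, scenario=scenario), 2))
```

Because the ratio is recomputed from the granular rows rather than from pre-aggregated buckets, the same dataset can be re-cut by entity, currency or tenor without losing consistency.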
Speed and autonomy
To visualise, navigate and efficiently report on the data, user agility is a prerequisite. Ultimately, both speed and autonomy allow business users to better understand data and focus on the most salient data points. Big data analytics solutions such as Opensee have been designed to enable the optimisation of resources with speed and user autonomy. What is new is that these operations can now be performed without compromising on volumes, meaning the full granularity of the dataset can be maintained. At the root of such a breakthrough is the new capacity to benefit from the horizontal scalability of disks, providing virtually unlimited capacity on a low-cost infrastructure, all while maintaining RAM-like speeds.
When it comes to optimising resources, rather than attempting to optimise different sources of trade data in a piecemeal fashion – and thereby running the risk of increasing the usage of one key resource while attempting to decrease another – multiple datasets can now be centralised into one, or a central dataset can be further enriched by adding a secondary set. This centralisation of data around risk metrics – for example, profit-and-loss information, balance sheet consumption and collateral usage – allows business users to aggregate, manipulate, analyse, simulate and visualise trade data with a simultaneous, comprehensive view of the impacts across all dimensions. This means the entire ‘utility function’ can be optimised, rather than just one metric at a time. Such enhanced data capacity represents a genuinely groundbreaking shift that ultimately allows banks to optimise their resources far more efficiently.
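A hedged sketch of that idea follows: once the marginal impacts of a candidate action on several resources are available from one centralised dataset, they can be combined into a single utility score instead of being optimised one metric at a time. The metric names, figures and weights below are purely illustrative assumptions, not a prescribed set.

```python
# Illustrative sketch: score a candidate action on every constrained
# resource at once and combine the impacts into a single utility,
# rather than optimising metric by metric.

def marginal_impacts(metrics_before: dict, metrics_after: dict) -> dict:
    """Per-metric change caused by the candidate action."""
    return {m: metrics_after[m] - metrics_before[m] for m in metrics_before}

def utility(impacts: dict, weights: dict) -> float:
    """Weighted sum across all resources; lower is better (less resource used)."""
    return sum(weights[m] * impacts[m] for m in impacts)

before = {"rwa": 420.0, "leverage_exposure": 910.0, "collateral": 55.0, "pnl": 0.0}
after  = {"rwa": 418.0, "leverage_exposure": 930.0, "collateral": 54.0, "pnl": 1.2}

impacts = marginal_impacts(before, after)
weights = {"rwa": 1.0, "leverage_exposure": 0.5, "collateral": 0.8, "pnl": -2.0}

# An action that reduces RWA can still score badly overall if it inflates
# leverage exposure - the point of optimising the whole utility function.
print(impacts)
print(round(utility(impacts, weights), 2))
```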
Fine-tuning resources with speed and agility
Confronted with stringent regulatory constraints and a challenging market environment, banks are having to adjust, leveraging new technologies and solutions to allocate their resources optimally by running multidimensional scenarios on their full granular datasets. The era of running optimisation scenarios on a manual and intuitive basis is coming to an end. Instead, financial institutions embracing innovative big data solutions are finally able to fine-tune their resources with speed and agility, to their advantage.