XVA reaches far and wide

The Panel

  • Nick Haining, Chief Operating Officer, CompatibL
  • Dennis Sadak, Senior Vice-President, Numerix
  • Marwan Tabet, Head of Enterprise Risk Management Practice, Murex

Practices range widely across the family of valuation adjustments – collectively known as XVAs – which are typically calculated by taking the expected positive exposures of a derivative at future points in time and then applying the relevant costs to that exposure. In a forum sponsored by CompatibL, Murex and Numerix, a panel of market practitioners examine some of the key issues, including the lack of standardisation and consistency, and how technological developments look to address some of them
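As a concrete illustration of that recipe, the sketch below prices plain CVA from simulated exposure paths in a few lines of Python. The flat hazard and discount rates, loss given default and exposure dynamics are placeholder assumptions for illustration, not any panellist's production methodology.

```python
import numpy as np

# Illustrative sketch of the recipe above: CVA as discounted expected
# positive exposure (EPE) weighted by default probability and loss given
# default. Flat hazard and discount rates and normally distributed
# exposures are placeholder assumptions, not a production model.

def cva(exposure_paths, times, hazard_rate, discount_rate, lgd=0.6):
    """exposure_paths: (n_paths, n_times) simulated mark-to-market values."""
    epe = np.mean(np.maximum(exposure_paths, 0.0), axis=0)   # E[max(V_t, 0)]
    df = np.exp(-discount_rate * times)                      # discount factors
    surv = np.exp(-hazard_rate * times)                      # survival probabilities
    dpd = -np.diff(np.concatenate([[1.0], surv]))            # default prob per bucket
    return lgd * np.sum(df * epe * dpd)

rng = np.random.default_rng(0)
times = np.linspace(0.5, 5.0, 10)
paths = rng.normal(0.0, 1.0, size=(10_000, times.size)) * np.sqrt(times)
print(cva(paths, times, hazard_rate=0.02, discount_rate=0.01))
```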

What are the greatest challenges facing the industry in the world of XVA?

Nick Haining, CompatibL

Nick Haining, CompatibL: The greatest risk in the world of XVA right now is the lack of standardisation in how valuation adjustments – such as funding valuation adjustment (FVA), margin valuation adjustment (MVA) and capital valuation adjustment (KVA) – are computed, reported and charged to clients. In a recent paper, Leif Andersen, Michael Pykhtin and Alexander Sokol argue that even the plain credit valuation adjustment (CVA) is not always properly evaluated because of previously unrecognised margin settlement risk – a view supported in recent Risk.net editorials by Nazneen Sherif. When a variety of disparate ways of computing valuation adjustments is combined with limited disclosure of calculation methodology for financial reporting purposes, it is difficult to assess the significance of the reported XVA figures in bank financials. The lack of a standard approach to charging these valuation adjustments to clients is also problematic, as it may cause firms that miscalculate one of the XVAs to suffer a ‘winner’s curse’, taking on trades with major hidden costs they do not appreciate.

Marwan Tabet, Murex: The industry is facing several challenges in the XVA domain. Banks recognise the need to move to the next stage by industrialising the process of pricing and managing XVA, but many lack the technology and the right organisation.

While most trading desks have implemented CVA and FVA solutions, they have often relied on tactical solutions, focusing primarily on their pricing engines. They are now facing significant limitations in terms of scaling and integrating new and complex valuation adjustments, as well as implementing central desk capabilities.

The inability to scale is one of the major impediments to streamlining the XVA process across an organisation. Banks need to compute existing and nascent XVAs across their entire derivatives portfolios, while sales desks require real-time pricing of XVA for any new transaction or trade amendment. Central desks need to calculate a constantly growing number of first- and second-order sensitivities, in addition to stress testing and running XVA profit-and-loss (P&L) attribution. Banks looking to deploy such capabilities face major challenges, which increasingly require new architecture and a redesign of their software solutions landscape.

Setting up central desks is another challenge facing the industry. The portfolio nature of XVA, encompassing a wide range of ‘cross-desk’ risks – such as credit, capital, funding and collateral – has led to the emergence of XVA desks, along with the introduction of new business and decision-making processes within capital markets organisations. The set-up of central desks poses substantial operational and technological challenges, as it requires the integration of processes and systems across the entire trading value chain.

Dennis Sadak, Numerix: Drawing from our own experience, the greatest XVA-related challenge facing the industry right now is the constantly evolving landscape. As regulations change, adjustments such as KVA have to be updated and revised accordingly. For example, when the Fundamental Review of the Trading Book (FRTB) was introduced, there was a rush to understand it so that XVA measures could properly reflect the new regulation.

Regulations forcing industry participants to account for the cost of clearing and of posting increased amounts of collateral have also given rise to MVA. So, as the regulatory landscape changes, practitioners responsible for XVA must be ready to adjust with it.

Another challenge is the compute aspect of XVA. In the front office, real-time pre-trade XVA measures are extremely complex. They require an enormous amount of compute power to handle the data being generated. 

So we’re dealing with business challenges as a result of the regulatory landscape, and technological challenges where speed and data management are the top concerns. 

Banks have so far been unwilling to price MVA into non-cleared trades since the initial margin (IM) rules were introduced, as no single approach has won consensus support. Are there signs of this changing?

Marwan Tabet: Pressure to include MVA in pricing is likely to grow in the coming months as we draw closer to phase III of the IM regime, which is currently scheduled for September 2018. At this time, only a minority of banks – those with aggregate notional amounts exceeding the threshold of $2.52 trillion – are posting IM on their non-cleared over-the-counter (OTC) derivatives, and many banks have optimised their portfolios via compression and other means to reduce their notional amounts. These banks are likely to come into scope during the next phase.

While all agree that IM on both cleared and non-cleared trades needs to be accounted for when measuring MVA, there are no best practices for modelling the expectation of IM over the lifetime of transactions. For example, the model may need to forecast IM under the standard initial margin model (Simm) for non-cleared derivatives and under a value-at-risk (VAR) model for centrally cleared derivatives. Several methods exist, but most have either performance or precision limitations. Implementing MVA is therefore mostly about finding the right trade-off – adopting approximations that allow MVA to be integrated into real-time pricing without compromising accuracy. In any case, several methods can co-exist and, depending on the actual usage, banks can use the method that best matches their particular needs or portfolio mix. This will certainly be the case when banks are calculating sensitivities or performing stress testing on MVA. Having the flexibility and the right architecture to achieve this is going to be fundamental.
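To make the object being approximated concrete, a common textbook formulation prices MVA as the discounted funding cost of the expected IM profile. The sketch below assumes that profile is already available; as Tabet notes, producing E[IM(t)] from a Simm or CCP VAR forecast is the hard part. The inputs are illustrative placeholders, not Murex's model.

```python
import numpy as np

# Sketch of the quantity being modelled: MVA as the discounted funding
# cost of the expected initial margin profile E[IM(t)] over the life of
# the portfolio. The hard problem discussed above is producing
# expected_im itself from a Simm or CCP VAR forecast; here it is a
# placeholder profile.

def mva(expected_im, times, funding_spread, discount_rate):
    df = np.exp(-discount_rate * times)            # discount factors
    dt = np.diff(np.concatenate([[0.0], times]))   # accrual periods
    return np.sum(expected_im * funding_spread * df * dt)

times = np.linspace(0.25, 5.0, 20)
expected_im = 10e6 * np.exp(-0.3 * times)          # illustrative amortising IM profile
print(mva(expected_im, times, funding_spread=0.005, discount_rate=0.01))
```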

Nick Haining: Because MVA represents a major cost compared with other valuation adjustments, we expect the industry to quickly converge around a standard way of pricing MVA – at least on a standalone basis. Accurate pricing of incremental MVA may take longer, as dealers may be initially unwilling to pass on cost savings from MVA offset on risk-reducing new trades to their clients.

Which valuation adjustments are set to cause the most headaches for the industry, and what specific regulation is likely to introduce new challenges for pricing XVA?

Dennis Sadak: Overall, uncertainty in regulations can leave a big question mark over what methodology to use to calculate some of these valuation adjustments.

Dennis Sadak, Numerix

Practically every valuation adjustment requires the simulation of different risk measures into the future, spanning the entire lifespan of those trades under consideration. Consequently, computational complexity is immense. Many, for example, are finding MVA to be a challenge. The reasoning is that, for MVA, the risk measure to be simulated into the future is a VAR-like measure. So, in other words, daily portfolio VAR must be simulated into the future for the entire lifespan of the trades. This is the case for trades cleared through central counterparties such as LCH or Eurex, as well as for OTC transactions that will be governed by the International Swaps and Derivatives Association’s (Isda’s) Simm.
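A brute-force rendering of that nested structure shows why the complexity is immense: an inner P&L simulation sits at every node of every outer path. The dynamics and position below are illustrative placeholders, not Numerix's implementation; production systems replace the inner loop with regression or other approximations precisely because of this cost.

```python
import numpy as np

# Brute-force nested simulation of a forward VAR-like initial margin:
# an inner P&L simulation at every node of every outer exposure path,
# exactly the structure described above. Dynamics and position are
# illustrative placeholders; the O(outer x inner x steps) cost is what
# makes this prohibitive at portfolio scale.

rng = np.random.default_rng(1)
n_outer, n_inner, n_steps, dt = 200, 1_000, 20, 0.25
vol = 0.2
spot = np.full(n_outer, 100.0)
forward_im = np.zeros((n_outer, n_steps))

for t in range(n_steps):
    spot *= np.exp(vol * np.sqrt(dt) * rng.standard_normal(n_outer))  # outer step
    for p in range(n_outer):
        # Inner simulation: 10-day P&L of a linear position at this node
        pnl = spot[p] * vol * np.sqrt(10 / 250) * rng.standard_normal(n_inner)
        forward_im[p, t] = -np.quantile(pnl, 0.01)   # 99% VAR-like IM

expected_im = forward_im.mean(axis=0)   # E[IM(t)] profile feeding an MVA integral
```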

Marwan Tabet: KVA is likely to pose serious challenges to the industry. The rules for calculating capital are quite complex and can be computationally intensive – in particular, when banks are using internal models. Integrating these calculations in a Monte Carlo engine requires a non-trivial mix of optimisation and approximation. In addition, a KVA model may need to account for future changes in regulations over the lifetime of the transaction. In that context, XVA frameworks need to accommodate a regulatory backdrop that is expected to continue in a state of flux over the next few years. Thus, the ability to manage configurability and variability of regulations is essential.

Nick Haining: This year’s prize for the most problematic valuation adjustment, as well as for the valuation adjustment most affected by the ongoing changes in regulation, goes to KVA. Not only does KVA depend on multiple regulatory capital methods – each contributing its share of the capital and each with its own calculation challenges – but the regulations also change over time. As a result, a KVA figure calculated to portfolio maturity on the basis of today’s regulations may prove inaccurate, because those regulations may change in ways we cannot anticipate. On top of that, recent publications by Duffie et al challenge the established way of calculating KVA based on the hurdle rate (expected return on capital) and propose a new approach based on the analysis of the entire balance sheet of the bank and the concept of ‘debt overhang’.
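For reference, the established hurdle-rate convention Haining mentions can be written schematically as a lifetime cost-of-capital integral – a generic textbook form, not CompatibL's or Duffie et al's specific formulation:

$$\mathrm{KVA} \;\approx\; \int_0^T \gamma_h \,\mathbb{E}\!\left[K(t)\right] D(t)\,\mathrm{d}t,$$

where $K(t)$ is the projected regulatory capital requirement across all contributing capital methods, $\gamma_h$ is the hurdle rate and $D(t)$ is a discount factor. The regulatory-change risk described above enters through $K(t)$, which must be projected under rules that may not remain fixed to maturity $T$.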

Are there new valuation adjustments on the horizon, or has the industry reached its limit?

Dennis Sadak: The industry hasn’t quite reached its limits. In fact, a new industry term – additional valuation adjustment (AVA) – has been introduced as part of prudent valuation, a regulation that sits somewhere between pricing and risk management. AVA categories can vary widely and take into account such things as operational and administrative costs.

Banks must consider the total cost of ownership of their pricing to understand the impact on their business and profitability. With a clear picture of operations – everything pre-deal and post-trade – banks will have a clear path to allocating these costs back to the individual trade level. Only at this point will they be able to determine if that particular area of business or trade type is profitable enough to stay operational, or if it should be automated or even shut down. 

Nick Haining: As the industry endeavours to model the costs of derivatives trading with ever-increasing precision, it is almost certain that other derivatives trading costs will be reflected in new valuation adjustments. We also see a trend toward splintering well-known valuation adjustments into multiple variations, depending on the assumptions used in their calculation and the way the total valuation adjustment is split into its component parts.

Is it becoming easier for banks to price in XVA as technology improves, and rules affecting them become more widely enforced?

Nick Haining: Definitely. As XVA calculation methodology becomes more standardised and more widely available, most financial institutions are able to compute XVA properly via Monte Carlo calculation, instead of relying on crude approximations or XVA calculations provided by their dealers. This trend leads to lower trading costs as more market participants are able to challenge XVA numbers provided by their counterparties and demand fair and accurate valuation of the XVAs charged to them.

Dennis Sadak: As technology improves, XVA will become more cost-effective to compute. 

For example, new quantitative methods – such as new approaches to algorithmic differentiation for calculating XVA sensitivities – will help to reduce computational expense. More efficient computational methods can cut the IT cost of XVA: the calculations are not necessarily easier, but they are less expensive to run.

Also, as seen with Isda’s introduction of standardised credit support annexes (CSAs), removing collateral optionality from CSAs reduces the complexity of calculating these measures.

As technology improves, speed and performance will also be impacted. From our perspective, the performance of XVA solutions has as much to do with the speed of the calculations as it does with the way those calculations are linked together in a real-time framework. For example, with trades and analytics running on new graph technologies, intraday changes feed in on a real-time, event-driven basis – so new trades, market data, CSAs and counterparties effect changes quickly and efficiently, recomputing only the minimal path. We see this as a very powerful technology.

How great a concern is figuring out how to price XVA for options compared with swaps?

Nick Haining: At this stage, accurate calculation of XVA – not only for swaps but also for more complex trade types such as options, callables and barriers – is routine and widely available within internal and vendor XVA solutions. Only after every trade in the portfolio is modelled properly on a standalone basis can the calculation accurately capture the portfolio-level effect of netting, CSA and IM on XVA.

How are new tech solutions evolving to help price XVA?

Marwan Tabet, Murex

Marwan Tabet: Recent advances in technology have significantly helped the development of new capabilities around XVA. Graphics processing units (GPUs) are shifting into the mainstream as they become critical for a variety of domains, such as deep learning. At Murex, we started using GPUs more than a decade ago for pricing complex derivatives. This experience was instrumental in designing an architecture that leverages GPUs for delivering advanced XVA functionalities, including sensitivities calculation, CVA attribution and CVA stress testing.

Cloud computing is another area that brings major benefits to banks. For XVA, banks will typically have a steady load on their systems throughout the day, and will run huge numbers of intensive calculations at specific times. Additional calculations may also be needed – for end-of-month reporting, for example – leading to even higher resource usage peaks. Cloud services can solve this problem by offering rapid elasticity at an optimised cost. However, for a cloud solution to be truly beneficial, it must satisfy two key criteria: first, it must leverage cloud technologies for scalability across the entire calculation chain; and second, it must be fully integrated into the bank’s processes across the entire trading value chain. We are committed to achieving both.

Dennis Sadak: New technology solutions are evolving in many interesting ways for XVA.

Constant progress in fields such as gaming has driven the adoption of technologies such as GPUs, which are changing the way hardware is designed to support the kinds of operations that can in turn be used for XVA calculations. Though not related to XVA or risk calculation today, Google has also introduced its first tensor processing unit, which accelerates the matrix multiplications at the heart of artificial intelligence and machine learning. In the future there could be specific applications in this area of finance.

In terms of software, market-standard scripting languages such as Python and open-source offerings provide interfaces that allow end-users to interact with analytics in very specific, bespoke ways. This has been central to building light, flexible front-end environments for XVA, where users can rapidly incorporate new features and address complex requirements.

The world of XVA is evolving not only because of advances in hardware, but also software technologies that are helping to manage the complexity of the data issue imposed by XVA. Technologies such as Cassandra, MongoDB and Hadoop are helping to price XVA as well as evolve banks’ entire IT infrastructures, making it possible to manage incredible amounts of unstructured data. 

Today, solutions must be flexible and robust enough to adapt. Therefore, cloud solutions have also become mainstream deployment strategies, especially for managing the compute needs of XVA and other risk management tasks. Whether a private, public, hybrid or managed service, a cloud computing infrastructure can help to produce pricing and risk reports for even the largest and most complex derivatives portfolios.

Nick Haining: The most important recent technology development in XVA is the introduction of adjoint algorithmic differentiation (AAD) as a method of calculating XVA sensitivities. The task of computing these sensitivities is a perfect match for the performance characteristics of AAD, which works best when a large number of sensitivities must be computed. Because XVA depends on tens or hundreds of curves, each XVA figure has hundreds and sometimes thousands of bucket sensitivities in total. With AAD, all of these sensitivities can be calculated at a computational effort of around five times that required to compute the XVA number once, irrespective of the number of sensitivities. For bucket XVA sensitivities, this delivers around two orders of magnitude of acceleration compared with the standard bump-and-reprice approach.
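The scaling argument can be demonstrated with any reverse-mode automatic differentiation tool. The toy sketch below uses JAX’s grad on a placeholder curve-dependent present value – an assumption for illustration, not CompatibL’s engine – and recovers all curve-node sensitivities from a single reverse sweep, checking them against bump-and-reprice.

```python
import jax
import jax.numpy as jnp

# Toy illustration of the AAD scaling argument: reverse-mode
# differentiation (here via JAX) returns sensitivities to every curve
# node in roughly one extra sweep, while bump-and-reprice needs a full
# revaluation per node. The pricing function is a placeholder, not an
# XVA engine.

def pv(zero_rates, times, cashflows):
    # Placeholder pricer: PV of a cashflow strip under a zero curve
    return jnp.sum(cashflows * jnp.exp(-zero_rates * times))

n = 40
times = jnp.linspace(0.5, 10.0, n)
cashflows = jnp.full(n, 1.0)
rates = jnp.full(n, 0.02)

# Reverse mode: all n node sensitivities from one extra sweep
node_sens = jax.grad(pv)(rates, times, cashflows)

# Bump-and-reprice for comparison: n separate revaluations
h = 1e-4
bumped = rates + h * jnp.eye(n)          # row i bumps node i only
fd = (jax.vmap(lambda r: pv(r, times, cashflows))(bumped)
      - pv(rates, times, cashflows)) / h
print(jnp.max(jnp.abs(node_sens - fd)))  # the two approaches agree
```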

Since the variation margin rules came into force on March 1, has progress been made towards a standardised approach to pricing MVA?

Nick Haining: At this time, the core mathematical principles of computing MVA are clear for both the schedule-based approach and the risk-sensitive Simm. However, effective numerical techniques for performing this calculation in practice at a reasonable computational effort are still being developed. In a presentation at the sixth annual WBS Initial Margin & XVA conference, Alexander Sokol, chief executive officer and head of quant research at CompatibL, proposed a fast and accurate method of computing MVA without crude approximations, using AAD.

 
