XVA management – Challenges and solutions

Amid a lack of established best practice on how to manage and calculate XVA, standardisation is a priority for many firms. In our XVA management forum, a panel of industry leaders discusses key topics, including the effect a changing regulatory landscape is having on XVA management, the potential impact of cloud computing and web-based technology, and industry-wide limitations with XVA calculation and how the panellists’ respective organisations are addressing them.

The panel

  • Stéphane Rio, Founder and Chief Executive, ICA
  • Marwan Tabet, Head of Enterprise Risk Management, Murex
  • Martin Engblom, Co-Chief Executive, triCalculate
  • Satyam Kancharla, Chief Strategy Officer and Senior Vice‑President, Client Solutions Group, Numerix
  • Antonina Harden, Senior Quantitative Specialist, OTC Derivatives, NFA

What are the biggest concerns currently about how the industry manages XVA?

Stéphane Rio, ICA: Beyond the lack of standardisation and transparency, which are key issues for the industry, each bank also has its own challenges.

Typically, there are two areas that are often contradictory, but always intertwined, in which the current situation seems unsatisfactory for XVA desks: speed – or efficiency – and acute understanding.

A typical illustration of concerns with speed is traders having to assess – in as close to real time as possible, though in practice this often takes minutes rather than a couple of seconds – the impact of a transaction on XVA. Add in the resulting changes to hedges and the entire process often becomes a daunting task.

In terms of understanding, whether profit and loss (P&L) explanations or sales attributions, determining the source of a change is similarly difficult.

Often, to resolve the first problem, solutions have compromised the second, and vice versa.

Satyam Kancharla, Numerix: Of paramount concern is a lack of standardisation and sparse employment of best practice in the calculation of XVAs and the pricing of trades during a transaction lifecycle. There is diversity in the ways XVAs are computed and how businesses approach their management. Having a very broad diversity of market practices can be a hindrance to market liquidity and the transparency of over-the-counter (OTC) derivatives markets.

Martin Engblom, triCalculate: The biggest concern in our view is inconsistency in approaches to XVA – which XVAs should be charged, and in which situations? The emergence of margin valuation adjustment (MVA) over the past 12 months has brought this to light again, after a period in which approaches to credit valuation adjustment (CVA), debit valuation adjustment (DVA) and funding valuation adjustment (FVA) had settled into something closer to a market standard.
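For reference, the broadly standard textbook expression for unilateral CVA – the kind of formulation the market-standard approaches mentioned above tend to converge on – discounts expected positive exposure against the counterparty’s default probability. The notation below is generic rather than any panellist’s model:

$$\mathrm{CVA} \approx (1-R)\int_0^T \mathbb{E}\big[D(t)\,V^{+}(t)\big]\,\mathrm{d}PD(t),$$

where $R$ is the assumed recovery rate, $D(t)$ the discount factor, $V^{+}(t)=\max(V(t),0)$ the positive exposure of the netting set at time $t$ and $PD(t)$ the counterparty’s cumulative default probability. DVA is the mirror-image term on the bank’s own default.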

XVA infrastructure also remains an issue. Large banks are still using in‑house builds that struggle with the capacity requirements of a modern XVA desk. Mid-tier banks are using legacy software solutions that are difficult to upgrade. Smaller banks and large corporates are relying on approximations and spreadsheet approaches that introduce inaccuracies and operational risk.

 

How difficult is it to make centralised XVA decisions with siloed trading desks?

Marwan Tabet, Murex: The disruptive pace of change – largely driven by regulation – is substantially impacting the total cost of trading. Therefore, to ensure sustainable profitability over the medium term, there is a strong business incentive to incorporate the right cost drivers through XVA.

Effective XVA decision-making requires the management of a wide range of ‘cross-desk’ risks such as credit, capital, funding and collateral. A siloed trading desk organisational approach, often combined with a fragmented systems architecture, makes this task extremely challenging.

A central desk has the advantage of pooling the significant expertise required across each of the disciplines, which introduces cohesion by design. It facilitates the implementation of consistent pricing, charging, hedging and P&L management operations across desks.

The implementation of a central desk is therefore crucial for effective XVA decision-making and active management. However, successful implementation requires complete and coherent integration with trading business processes across desks, of which technology is a key enabler. A central XVA solution must combine rich business content and high performance, and it must deliver strong integration capabilities able to handle trading needs such as real‑time pricing, simulation and what‑if analysis.

Stéphane Rio: XVA desks have the unique feature of being true cross‑asset desks. In particular they are, by nature, forced to rely on data – trades, counterparty and market data – coming from all other desks in the bank. This raises questions around the heterogeneity and quality of this data, which will drive important decisions by the XVA desk. Appropriate processes and controls have to be put in place to mitigate this risk.

A second aspect is organisation. When it comes to pricing a client trade, there are several parties involved in building the final price, including the trading desk (risk-free price), the XVA desk (XVA margin) and the sales desk (sales margin). A robust XVA system must support an efficient sales-pricing workflow, including all actors’ interactions, and ensure a timely response to the client.

Martin Engblom: Centralised XVA desks are now commonplace. However, challenges persist, including the need to provide transparency to trading desks around charges that impact them. Ensuring consistent calculations across an institution is part of the challenge, and global access to calculations can bring transparency to how an XVA desk charges its internal clients.

Satyam Kancharla: Correctly managing XVAs on a siloed basis is next to impossible, which is the reason for XVA desks. It is important to have a unified framework for collecting data and running calculations. All XVA-related activities should be brought together on a consistent basis so a business can implement XVAs correctly and take action on different tasks, such as pricing, hedging, risk management, accounting and capital management. This requires more complex interrelationships between different business areas – something that didn’t previously exist.

 

How significant a concern is needing to make hard and fast decisions on XVA in a changing regulatory landscape?

Martin Engblom: An example of this concern exists in capital valuation adjustment (KVA). It is difficult to project margin and capital into the future when you don’t know if or when regulation – and therefore margin and capital requirements – will change. It should be noted, however, that an accurate KVA calculation represents the expected value of future quantities, with uncertainty and associated probabilities therefore built into the calculation. It is important that users of KVA numbers in particular are aware of these uncertainties and have the flexibility to adapt their calculations for a range of different assumptions.
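As a rough illustration of the point about expected future quantities, KVA is often written as the discounted expectation of the projected capital profile weighted by a cost-of-capital rate; the formulation below is a generic sketch, not triCalculate’s specific methodology:

$$\mathrm{KVA} \approx \int_0^T c_K\,\mathbb{E}\big[D(t)\,K(t)\big]\,\mathrm{d}t,$$

where $K(t)$ is the simulated regulatory capital requirement at future time $t$, $D(t)$ the discount factor and $c_K$ the assumed cost (hurdle rate) of holding that capital. Uncertainty about future rules enters through the assumptions used to project $K(t)$.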

Satyam Kancharla: We haven’t experienced a static set of regulations and we’ve seen how they can and do change. That is why it is important to consider regulatory change within an XVA framework. If a market participant is able to do that, then decisions can be made based on the current and evolving regulatory environment. 

XVA calculations require projecting several years into the future. Amid an environment of uncertainty around how regulations will look in the future, regulatory assumptions made today cannot be considered reliably accurate for the future. However, we need frameworks that allow such change to be modelled because regulatory change is not going to go away – especially if you examine long-term horizons, as in XVA calculation and management.

Stéphane Rio: Such a landscape often translates into new and potentially complex system requirements. In practice, however, system evolutions are very slow because system landscapes in banks are still quite monolithic. With traditional system vendors, upgrading to obtain new functionality is often a large project that can take months, if not years.

For banks undertaking internal development, there is a need to separate tasks into independent modules to gain agility and adapt more quickly to the required evolution. For banks using vendors, it is time to think of adopting Software as a Service (SaaS) – adjusting to regulatory changes will only be a matter of testing the new results or connecting the results to the internal workflow, and can be achieved in just a few weeks at minimal cost.

 

Could a series of deregulations render investment in XVA calculations pointless?

Satyam Kancharla: Absolutely not. XVAs are tied to many factors, such as risk, clearing, collateral and margining – not just regulations. XVAs are a market shift that took on a degree of acceptance and are here to stay. A good analogy is the market crash in 1987, when markets shifted to create pronounced volatility skews. Some market participants adapted quickly and others did not. Those who did not lost money and very quickly learned their lesson.

Stéphane Rio: I don’t think one should take bets on whether there will be more or less regulation, as it is ever-changing. But whether because of changing regulation or deregulation, banks should be agile in their development processes and vendors’ upgrades.

Martin Engblom: Regulations will not make XVA calculations pointless, but they will cause focus to shift between XVA numbers. The new margin rules are, for example, shifting focus from CVA and DVA to MVA. In this case, XVA will have less impact on the balance sheet as CVA and DVA decrease, but it will increase the cost of trading through the initial margin (IM) funding requirements. The power of XVA calculations is that they provide an effective tool for measuring trading costs. Regardless of increased or decreased regulation, measuring these costs will remain a fundamental part of an optimal trading franchise.

 

What are the current limitations for the industry when calculating XVA?

Satyam Kancharla: I think there is a range of limitations, such as data, technology and the availability of the right tools and frameworks to define XVAs, but mostly the issue is a general lack of understanding and standards of how to address each XVA situation. XVAs break the law of one price, so every situation will be different, be it for pricing, hedging, risk management, accounting, finance or capital management.

Marwan Tabet: Predictably, the compute challenge of XVA has already surpassed the limitations of the previous generation of risk systems infrastructure. No sooner had systems come to terms with the compute demands of CVA and FVA than MVA arrived, demanding compute power an order of magnitude higher due to the requirement of modelling IM – cleared and bilateral – for every scenario in the Monte Carlo model. This has driven innovation in underlying risk technologies, including graphics processing units, adjoint algorithmic differentiation, graph programming and cloud computing.
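The compute burden described here can be seen from the generic form of MVA, which requires the initial margin profile itself to be simulated along every path; the expression below is a common textbook sketch rather than Murex’s specific implementation:

$$\mathrm{MVA} \approx \int_0^T \mathbb{E}\big[D(t)\,s_{\mathrm{IM}}(t)\,\mathrm{IM}(t)\big]\,\mathrm{d}t,$$

where $\mathrm{IM}(t)$ is the (cleared or bilateral) initial margin at future time $t$, $s_{\mathrm{IM}}(t)$ the funding spread applicable to posted margin and $D(t)$ the discount factor. Because $\mathrm{IM}(t)$ is typically a sensitivity- or value-at-risk-based quantity, it must itself be re-evaluated inside each Monte Carlo scenario.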

Modelling challenges include observability and availability of various data inputs required for calibration, which is a particular problem in less developed and smaller markets where there is low liquidity.

A common limitation today is the time lag in robust and dependable XVA numbers becoming available. Using approximations for intraday XVA pricing can result in inaccurate hedges or even mispriced trades, which are expensive to correct. Moving from a batch end-of-day calculation framework to integrating XVA pricing into intraday trading decisions demands more from a firm’s risk systems layer. Significant investment is required for this, whether through in-house systems or vendor solutions, but it is the key transformation enabler that puts XVA at the heart of an organisation’s profitable trading decisions.

Stéphane Rio: Big compute and big data. Banks inevitably bump into large-scale parallel compute issues and the need to manipulate very large amounts of data.

Often these hurdles are resolved through approximations – thereby avoiding a full revaluation – and through discarding intermediate results, which creates additional problems: 

  • Model validation challenges – in particular when models from the risk department differ from the front-office models.
  • The inability to save intermediate results complicates the analysis of results and requires full recomputing for each incremental pricing.

Furthermore, access to sufficient compute power is critical. However, when using a finite quantity of in-house servers, XVA desks lack compute power for night batches – for in-depth sensitivities, cross‑gamma or stress-test analysis, for example – and must bear the cost of ‘sleeping resources’ for most of the day.
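As a minimal sketch of why saving intermediate results matters, the hypothetical Python fragment below caches per-path, per-date portfolio exposures so the incremental CVA of a new trade can be obtained by adding its simulated values to the stored paths, rather than revaluing the whole portfolio. The class, field names and data shapes are illustrative assumptions, not any vendor’s API.

```python
import numpy as np

class ExposureCache:
    """Stores simulated netted portfolio values per Monte Carlo path and time step."""

    def __init__(self, portfolio_values, discount_factors, default_probs, recovery=0.4):
        # portfolio_values: array of shape (n_paths, n_dates) of netted MTM values
        self.portfolio_values = portfolio_values
        self.discount_factors = discount_factors                 # shape (n_dates,)
        self.marginal_pd = np.diff(default_probs, prepend=0.0)   # from cumulative PDs
        self.recovery = recovery

    def cva(self, extra_trade_values=None):
        """CVA of the cached portfolio, optionally with one extra trade added."""
        values = self.portfolio_values
        if extra_trade_values is not None:
            values = values + extra_trade_values                 # reuse stored paths
        epe = np.maximum(values, 0.0).mean(axis=0)               # expected positive exposure
        return (1.0 - self.recovery) * np.sum(self.discount_factors * epe * self.marginal_pd)

    def incremental_cva(self, extra_trade_values):
        """Charge for a new trade without a full portfolio revaluation."""
        return self.cva(extra_trade_values) - self.cva()


# Illustrative usage, with random numbers standing in for simulated values
rng = np.random.default_rng(0)
cache = ExposureCache(
    portfolio_values=rng.normal(0.0, 1.0, size=(10_000, 40)),
    discount_factors=np.exp(-0.02 * np.linspace(0.25, 10.0, 40)),
    default_probs=1.0 - np.exp(-0.01 * np.linspace(0.25, 10.0, 40)),
)
new_trade = rng.normal(0.05, 0.2, size=(10_000, 40))
print("Incremental CVA charge:", cache.incremental_cva(new_trade))
```

Discarding the per-path values after the nightly batch forces the full simulation to be rerun for every such incremental query, which is the trade-off described above.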

Martin Engblom: Speed and accuracy. Many banks are still using legacy calculation engines that were developed when XVA was not part of managed P&L. The emergence of the XVA desk has highlighted that slower, less accurate calculation engines produce noisy P&L and unstable sensitivities. The quality of those analytics cannot meet the standards traders have come to expect from front-office risk systems. Better analytics engines are now emerging, but replacing an existing risk system is a challenging task involving many stakeholders across multiple functions within a bank.

 

In what way would your organisation address these limitations?

Stéphane Rio: Those limitations are what ICA is focusing on. We don’t believe models are the issue, but rather the implementation of models in the right architecture and infrastructure. In delivering a fully serviced solution, we deal with the full processing chain and have made it our speciality to tackle big compute issues on behalf of banks. 

ICA has embraced all the recent big data and cloud technologies to address those issues and benefit from the following:

  • Combining the business, digital technologies and quant expertise into a single team
  • Leveraging the cloud for all non-confidential calculations, allowing the majority of compute to be fully scalable and elastic
  • Being able to save and manipulate enormous quantities of data in a database, rather than in memory.

This allows us to:

  • Optimise the distribution of calculations (avoid redundant calculations, minimise input/output, and so on)
  • Compute cross‑gamma or stress-test scenarios on demand. More generally, compute what you need rather than what you can or what you did the day before
  • Access all intermediate calculations in real time, facilitating result investigation and generation of what-if scenarios in real time – pre‑trade pricing, post‑trade optimisations, changes of credit support annex terms, central counterparty upload, and so on.

Satyam Kancharla: Standards and best practices have to emerge through the work of industry publications such as Risk, as well as through regulators, technology providers and the like. We have come a long way since the chaos of a few years ago, and it appears we are converging around a core set of standards.

Martin Engblom: We frequently hear from large bank clients that a faster calculation rate gives them access to risk calculations that they would otherwise be unable to process daily. Faster calculations mean they can manage their risks more precisely, support functions can get the full suite of analytics they require and reporting of XVA matches other trading risks. Furthermore, the 100,000 Monte Carlo paths we run as standard ensure that banks benefit from increased accuracy of XVA and sensitivities. However, simply providing fast and accurate XVA calculations isn’t sufficient for today’s market participants. To facilitate the adoption of more cutting-edge analytics tools, user-friendliness and transparency are crucial. Consistent, interactive calculations that are available to all stakeholders across a firm make the switch from a legacy system easier.
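The benefit of a higher path count can be quantified with the standard Monte Carlo error estimate – a generic statistical fact rather than a statement about any particular engine:

$$\text{standard error} \approx \frac{\sigma}{\sqrt{N}},$$

so moving from 10,000 to 100,000 paths reduces simulation noise on an XVA estimate by a factor of roughly $\sqrt{10}\approx 3.2$, all else being equal.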

 

How influential could cloud computing and web-based technology be in transforming the calculation of XVA and the management of XVA data?

Marwan Tabet: Cloud computing is highly appealing as part of an XVA calculation framework because of the cost and efficiency benefits that derive from its inherent elasticity. XVA compute demands are ‘bursty’ in nature – therefore, being able to invoke compute infrastructure on demand and release it afterwards is both time- and cost-efficient.

Many early industry concerns surrounding data privacy and security when adopting the cloud have been allayed. Cloud providers have invested massively in this security layer, while data masking and obfuscation techniques of private or commercially sensitive data have become embedded in the data and environment management best practices of most financial institutions.

Nevertheless, a cloud-based solution must also be integrated across an organisation’s various business processes – for example, for sales and trading, central desks, and finance and risk management activities. Therefore, it is fundamental for the cloud solution to deliver ‘integrated’ functionality that satisfies organisations’ various needs, such as pre‑trade analysis, real‑time pricing, P&L attribution, and so on. Otherwise, the benefits of cloud computing will be limited to the cost-efficiency of batch calculations and will not address specific business requirements related to XVA.

Satyam Kancharla: I believe cloud computing is increasingly accepted and is particularly relevant for XVA transformation. XVA calculations are very complex and require immense compute power. Building out server farms to handle the compute challenge can be very expensive, so for many firms it could be more beneficial to transfer the task to the cloud. The cloud also presents a valuable opportunity in empowering banks to meet not only their XVA calculation needs, but also the data aggregation, processing and risk management requirements in an agile and flexible environment, while reducing IT expenses.

Martin Engblom: Cloud computing and web-based technology are already fundamentally changing the risk analytics landscape. 

This is particularly true for XVA, where a web-based approach to calculating and managing these risks is cost- and resource-effective. Web-based solutions are quick and easy to implement because there are no hardware or software installation requirements. They can provide centralisation within an institution and ensure multiple users and business units in a firm have transparency over calculation results – even when these are complex analytics across a wide range of asset classes, data sources and instrument types.

The ability to adapt seamlessly to market changes is important in an increasingly regulation-driven environment. Users of a web-based XVA service can respond quickly to regulatory developments without the need for disruptive, time-consuming and expensive new installations or updates.

While the market adoption of cloud computing and web-based technologies has begun for many firms, some are still hesitant to embrace the change because of apprehensions about data security. triCalculate is unique in that it operates within a private cloud infrastructure, combining the data security of an installed solution with the convenience and scalability of a cloud solution.

Stéphane Rio: Using the elasticity and scalability of the cloud to run massive computations is now largely recognised as a required feature of new-generation XVA solutions. The challenge, however, lies in how secure this data and these processes will be. Regulators have issued sensible guidelines, and additional security can be added to that. For instance, ICA’s process will strip any confidential information – such as notionals, counterparties or netting sets – out of deal descriptions before anything is sent to a public cloud.
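As a hedged illustration of the kind of anonymisation described, the short Python sketch below replaces counterparty and netting-set identifiers with salted hashes and rescales notionals before a trade description leaves the bank. The field names and scaling scheme are hypothetical assumptions, not a description of ICA’s actual process.

```python
import hashlib
import secrets

def anonymise_trade(trade: dict, notional_scale: float) -> dict:
    """Return a copy of a trade description safe to send to a public cloud.

    Counterparty and netting-set identifiers are replaced with salted hashes,
    and the notional is rescaled so absolute trade sizes are not revealed.
    Market-risk-relevant fields (dates, rates, currencies) are kept as-is.
    """
    salt = secrets.token_hex(8)  # per-batch random salt, kept on-premises

    def mask(value: str) -> str:
        return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

    cleaned = dict(trade)
    cleaned["counterparty"] = mask(trade["counterparty"])
    cleaned["netting_set"] = mask(trade["netting_set"])
    cleaned["notional"] = trade["notional"] / notional_scale
    return cleaned


# Illustrative usage: results returned from the cloud are rescaled internally
trade = {"counterparty": "Bank ABC", "netting_set": "NS-001",
         "notional": 25_000_000, "currency": "EUR",
         "maturity": "2028-06-30", "fixed_rate": 0.021}
print(anonymise_trade(trade, notional_scale=1_000_000))
```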

Web-based technologies are also key contributors to the reshaping of the landscape for XVA and derivatives pricing and risk calculations. This is a core aspect of ICA’s value proposition; we support banks in implementing those technologies to modernise their architectures. Interestingly, digital innovators have also presented a new way to approach processes and build setups through the flexibility and efficiency of SaaS, which is surely the future of financial software.

 

Which XVAs are causing the most headaches at the moment, and why?

Antonina Harden, NFA: The IM requirements for uncleared swaps have been rolling out globally since 2016 and will continue to do so until 2020. As a result, for the foreseeable future, MVA will continue to attract the industry’s attention. The industry is still searching for a ‘gold standard’ MVA calculation methodology. In addition to converging on the methodology for MVA calculation, the industry is also focused on the impact IM requirements for uncleared swaps will have on the rest of the XVAs. However, given that many more industry participants are expected to be in scope for compliance with the IM requirements for uncleared swaps in 2019 and 2020, a full assessment of how the balance of power among the XVAs will shift is still to be completed.*

Marwan Tabet: For XVA players already comfortable with CVA and FVA, MVA is a big headache. Apart from the significant additional analytics compute power it entails, there is a lack of consensus within and between firms on how this charge should be passed back up the chain to clients. At the same time, there is a realisation that non-cleared OTC IM costs are significant, or will be once firms are captured by the remaining regulatory phase-in waves: 26 dealers had posted a total of $74 billion of collateral to meet regulatory non-cleared IM obligations, according to the most recent International Swaps and Derivatives Association margin survey, published at its 2018 AGM.

KVA calculation is affected by the still-changing regulatory capital rules. Some practitioners are assessing whether they can capture, in today’s KVA charge, foreseeable capital charges based on known future implementation dates – for example, the standardised approach to counterparty credit risk.

Stéphane Rio: A typical example of an XVA metric that is highly challenging – or will become so in the near future – is the CVA capital charge under the Fundamental Review of the Trading Book’s standardised approach. For the spot capital charge alone, it requires the calculation of CVA sensitivities across pretty much all risk factors – rates, volatility, credit, and so on.

For full KVA pricing, this process must potentially be repeated for every projected time step. Lastly, trade allocation for that metric involves even more complex calculations – multidimensional and non-linear solving – so banks must calibrate compute power accordingly and think through how capital is allocated by trade, desk and business line.
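A minimal sketch of what ‘CVA sensitivities across pretty much all risk factors’ implies computationally is a bump-and-revalue loop like the hypothetical one below; real implementations typically rely on adjoint algorithmic differentiation instead, and the function names here are placeholders rather than any firm’s API.

```python
def cva_sensitivities(compute_cva, market_data: dict, bump_size: float = 1e-4) -> dict:
    """Finite-difference CVA deltas with respect to each market risk factor.

    compute_cva: callable taking a market-data dict and returning a CVA number
                 (assumed to involve a full Monte Carlo revaluation).
    market_data: flat dict of risk-factor name -> level (rates, credit spreads, vols...).
    """
    base = compute_cva(market_data)
    sensitivities = {}
    for factor, level in market_data.items():
        bumped = dict(market_data)
        bumped[factor] = level + bump_size
        # One extra full XVA simulation per risk factor: this is where the
        # compute cost explodes once hundreds of factors are in scope.
        sensitivities[factor] = (compute_cva(bumped) - base) / bump_size
    return sensitivities
```

With hundreds of risk factors, and a KVA-style projection repeating the exercise at multiple future time steps, the number of full revaluations multiplies quickly, which is the scaling problem described above.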

Martin Engblom: MVA is an immediate problem for banks preparing for IM regulation and for those wanting to keep track of funding costs associated with clearing-house IM. The increasing demand for MVA calculations demonstrates the challenges legacy risk systems face in keeping up with new market requirements. To calculate standard IM model MVA, one must project not only the future present value of all trades in the portfolio, but also a full specification of trade-level sensitivities. This is a much more compute-intensive simulation, which requires sophisticated software and hardware to complete in a reasonable amount of time. It is also clear that a crude approximation for MVA is not effective – often spreadsheet calculations fail to capture the true dynamics of one’s IM requirements, and trades are mispriced as a result.

KVA is another topic for debate; however, this issue is more conceptual than computational. How does one predict or model future capital regulations? Which capital components should be included in the KVA simulation? Which hurdle rate or cost of capital is correct? 

Whereas for the MVA calculation clear consensus is forming that a Monte Carlo simulation of future sensitivities is required to capture the correct level of accuracy, we expect the KVA debate to continue for some time before a market standard becomes clear.

Satyam Kancharla: MVA and KVA. MVA because it is fairly new and compute‑intensive – it requires the calculation of future exposures and future sensitivities – and will certainly further impact the economics of OTC trading. Some firms take a simple approach to addressing MVA, but that is a mistake. The methodology should not be diluted so much that MVA ceases to express the properties and complexities embedded within it. MVA is the most complicated XVA and should be comprehensively addressed. 

KVA is also compute‑intensive, but the greatest challenge with it is the plausibility of a changing capital regime.

 

*The views expressed by Antonina Harden are her individual views and do not necessarily reflect those of NFA.
