Why not having AAD needn’t be the end of the world
Optimisation method offers quicker and more focused way of making XVA calculations

It's been more than half a decade since the start of the adjoint algorithmic differentiation (AAD) revolution. Previously, major banks had been engaged in something of an arms race, with some investing enormous sums in an effort to gain mightier processing power. But AAD has rendered much of that power redundant. The mathematical technique is capable of speeding up the calculation of risk sensitivities, or Greeks, by up to a thousand times compared with traditional methods, and is now being used by firms for a host of different applications.
Traditionally, risk sensitivities are obtained by adjusting or 'bumping' inputs one by one and repricing the derivatives over and over again – once for every input and every trade. This adds up to a large number of calculations, which can sometimes take days to run. The process gets even more complicated when the sensitivities to be calculated are those of derivatives valuation adjustments (XVAs).
In contrast, AAD works out the sensitivities using a single pricing calculation, exploiting the chain rule of differentiation that allows for the simultaneous calculation of sensitivities. The result is an almost real-time calculation of risk. The downside is that the method is tough for many to get their heads around, and banks have to rewire their IT architecture around each application to be able to run it. This puts banks that haven't yet implemented AAD at a massive disadvantage.
However, it looks like there is now a workaround.
In a paper published recently on Risk.net, titled Risk optimisation: the noise is the signal, Benedict Burnett, Tom Hulme and Simon O'Callaghan – all of whom work at Barclays in London – propose a technique for optimising risk calculations without AAD. In it, the authors show that speeds similar to AAD can be achieved by identifying risks that aren't material and spending less time on them, using the example of XVA risk calculations.
Typical XVA books have thousands of counterparties, but not all of them contribute in the same way towards each risk, with some being riskier than others. If banks were to run simulations of the same specification for all counterparties, a lot of computational effort would be wasted on counterparties that are not particularly risky. "We fell into that trap and probably other banks did that as well, imposing a structure at the start rather than letting the numbers speak for themselves," says O'Callaghan.
To avoid this, the authors run so-called "lightweight" simulations – with fewer paths for each counterparty – as a preliminary step, to see how much each one contributes to the overall estimation error. From this, they work out the number of simulation paths and the frequency of time points to use when running the full simulations for each counterparty. These two factors determine how computationally intensive the calculations will be, so tuning them lets the user calculate risks at the speed and level of precision that traders require.
The basic idea is to focus on keeping the time spent on each risk proportionate to its error, minimising the overall error for the whole portfolio for a given period of time. "If I have 10 minutes to get the result, it can be much more accurate, but if I need it in a few minutes or 20 seconds, I would still be able to choose the right target amount of time," says Hulme. "It is up to the users. When there has been a very large move – like Brexit, for example – and traders need to quickly rerun the batch intraday on a less accurate but faster run, they can do that by changing a single parameter."
By applying fewer paths and time-steps to less material counterparties, the risks of the overall XVA book can be calculated much faster, while more precise calculations are reserved for the riskier profiles, which get a greater number of paths and time-steps. Thanks to this flexibility, the Greeks can be obtained one to two orders of magnitude faster than a complete simulation would allow, the authors say.
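The allocation principle – keep the effort spent on each counterparty proportional to its estimated error – can be sketched in a few lines. The function below is an illustrative assumption, not the authors' actual scheme: it takes rough per-counterparty error estimates from a lightweight pilot run and splits a total path budget in proportion to them, so riskier counterparties get more paths.

```python
def allocate_paths(pilot_errors, total_budget, min_paths=100):
    """Split a total simulation-path budget across counterparties in
    proportion to each one's estimated error from a lightweight pilot run.

    pilot_errors: dict mapping counterparty name -> rough error estimate
    total_budget: total number of paths available for the full run
    min_paths:    floor so no counterparty is simulated too coarsely
    (All names and the proportional rule are illustrative assumptions.)
    """
    total_err = sum(pilot_errors.values())
    alloc = {}
    for cpty, err in pilot_errors.items():
        # Proportional share of the budget; equal split if all errors are zero
        share = err / total_err if total_err > 0 else 1.0 / len(pilot_errors)
        alloc[cpty] = max(min_paths, round(total_budget * share))
    return alloc

# Pilot ("lightweight") run gives rough error estimates per counterparty;
# the riskiest counterparty then receives the bulk of the paths.
pilot = {"cpty_A": 5.0, "cpty_B": 1.0, "cpty_C": 0.2}
paths = allocate_paths(pilot, total_budget=10_000)
```

Shrinking `total_budget` is the single-parameter speed/accuracy dial the authors describe: a smaller budget gives a faster, coarser intraday run, a larger one a more accurate overnight batch.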
What this means is that not all banks have to rely on the magic of AAD to quickly calculate their risks.
"Everyone would want to implement AAD since it would increase precision and stability with several orders of magnitude in performance improvements; however, the cost of doing that is just too expensive," says Francois Bergeaud, the head of XVA quantitative analytics at Royal Bank of Scotland in London. "It is the fastest thing you can think of, but if you don't want to invest in a whole team of quants – say, 10 people – for three years working on refactoring the library, then this approach is a good practical compromise."
As the role of quants becomes increasingly driven by technology and optimisation, some tricks of the trade will survive and others will die out, in a tough evolutionary battle accelerated by rising costs and regulatory changes. AAD is not likely to be made extinct any time soon, but the optimisation proposed by the authors appears to give banks with fewer resources an option that puts them on a level playing field. That's all the more important at a time when risk management and capital savings come with a huge price tag attached.
Also out this month: Operational risk modelled analytically II: classification invariance, by Vivien Brunel
Copyright Infopro Digital Limited. All rights reserved.