Why not having AAD needn’t be the end of the world

Optimisation method offers quicker and more focused way of making XVA calculations

AAD rejection

It's been more than half a decade since the start of the adjoint algorithmic differentiation (AAD) revolution. Previously, major banks had been engaged in something of an arms race, with some investing enormous sums in an effort to gain mightier processing power. But AAD has rendered much of that power redundant. The mathematical technique is capable of speeding up the calculation of risk sensitivities, or Greeks, by up to a thousand times compared with traditional methods, and is now being used by firms for a host of different applications.

Traditionally, risk sensitivities are obtained by adjusting or 'bumping' inputs one-by-one and repricing derivatives over and over again for every input and every trade. This adds up to a large number of calculations, which can sometimes take days to run. The process gets even more complicated when the risk sensitivities to be calculated are those of derivatives valuation adjustments (XVAs).
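The bump-and-revalue approach can be sketched in a few lines. This is a minimal illustration, not any bank's production code: the Black-Scholes call is a stand-in for a generic pricing function, and the point is the cost structure, two full repricings per input.

```python
import math

def bs_call_price(spot, strike, vol, rate, tenor):
    """Black-Scholes call price (standard formula, used here as a
    stand-in for an arbitrary, expensive pricing function)."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol ** 2) * tenor) / (vol * math.sqrt(tenor))
    d2 = d1 - vol * math.sqrt(tenor)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return spot * N(d1) - strike * math.exp(-rate * tenor) * N(d2)

def bump_and_revalue_greeks(price_fn, inputs, bump=1e-4):
    """Central finite differences: two full revaluations per bumped input.
    With thousands of trades and hundreds of inputs, these repeated
    repricings are what makes overnight risk batches so slow."""
    greeks = {}
    for name in inputs:
        up, down = dict(inputs), dict(inputs)
        up[name] += bump
        down[name] -= bump
        greeks[name] = (price_fn(**up) - price_fn(**down)) / (2 * bump)
    return greeks

greeks = bump_and_revalue_greeks(
    bs_call_price,
    {"spot": 100.0, "strike": 100.0, "vol": 0.2, "rate": 0.01, "tenor": 1.0},
)
```

For five inputs this already costs ten pricing calls; the bill scales linearly with the number of inputs, which is the problem AAD removes.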

In contrast, AAD works out the sensitivities using a single pricing calculation, exploiting the chain rule of differentiation that allows for the simultaneous calculation of sensitivities. The result is an almost real-time calculation of risk. The downside is that the method is tough for many to get their heads around, and banks have to rewire their IT architecture around each application to be able to run it. This puts banks that haven't yet implemented AAD at a massive disadvantage.
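To make the chain-rule mechanics concrete, here is a toy reverse-mode differentiator, a few dozen lines rather than a refactored pricing library, but it shows the key property: one backward sweep over the recorded computation yields the sensitivity to every input simultaneously, regardless of how many inputs there are.

```python
class Var:
    """Minimal reverse-mode AD node: each operation records its parents
    and local derivatives, so a single backward sweep can apply the
    chain rule to every input at once."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent_node, local_derivative) pairs
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])
    __rmul__ = __mul__

def backward(output):
    """Propagate d(output)/d(node) through the tape in reverse
    topological order (the adjoint sweep)."""
    order, seen = [], set()
    def topo(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                topo(parent)
            order.append(node)
    topo(output)
    output.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += node.grad * local

# price = x*y + y: both sensitivities fall out of one backward pass
x, y = Var(3.0), Var(4.0)
price = x * y + y
backward(price)
```

After the sweep, `x.grad` and `y.grad` hold d(price)/dx and d(price)/dy. The catch the article describes is visible even here: every operation in the pricing code has to be rewritten in terms of these taped types, which is why adopting AAD means rewiring the library.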

However, it looks like there is now a workaround.

In a paper published recently on Risk.net, titled Risk optimisation: the noise is the signal, Benedict Burnett, Tom Hulme and Simon O'Callaghan – all of whom work at Barclays in London – propose a technique for optimising risk calculations without AAD. In it, the authors show that speeds similar to AAD can be achieved by identifying risks that aren't material and spending less time on them, using the example of XVA risk calculations.

Typical XVA books have thousands of counterparties, but not all of them contribute in the same way towards each risk, with some being riskier than others. If banks were to run simulations of the same specification for all counterparties, a lot of computational effort would be wasted on counterparties that are not particularly risky. "We fell into that trap and probably other banks did that as well, imposing a structure at the start rather than letting the numbers speak for themselves," says O'Callaghan.

To avoid this, the authors run so-called "lightweight" simulations, with fewer paths for each counterparty, as a preliminary step to see how much each one contributes to the overall estimation error. From this, they determine the number of simulation paths and the frequency of time points to be used when running the full simulations for each counterparty. These two factors drive how computationally intensive the calculations will be, so tuning them allows the user to calculate risks with the speed and level of precision that traders require.
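One simple version of this two-step idea can be sketched as follows. The exposure simulator and the proportional-to-error allocation below are illustrative assumptions, a classic Neyman-style allocation rather than the authors' exact optimisation, but they show the shape of the approach: a cheap pilot run per counterparty, then a path budget skewed towards the noisiest names.

```python
import random
import statistics

def pilot_std(counterparty_seed, pilot_paths=64):
    """Lightweight pilot run: estimate the per-path standard deviation of
    a hypothetical exposure simulator for one counterparty. The Gaussian
    with scale (1 + seed) is a placeholder for a real exposure model."""
    rng = random.Random(counterparty_seed)
    samples = [rng.gauss(0.0, 1.0 + counterparty_seed) for _ in range(pilot_paths)]
    return statistics.stdev(samples)

def allocate_paths(stds, total_paths):
    """Assign paths proportionally to each counterparty's pilot error,
    so the overall variance is minimised for a fixed compute budget
    (a simple stand-in for the paper's optimisation, not its exact scheme)."""
    total = sum(stds)
    return [max(1, round(total_paths * s / total)) for s in stds]

stds = [pilot_std(seed) for seed in range(5)]
paths = allocate_paths(stds, total_paths=10_000)
```

The riskiest counterparties end up with the most paths, while quiet names get a handful, which is how the overall batch can run one to two orders of magnitude faster without the error budget blowing up. The single `total_paths` knob plays the role of the "single parameter" the authors describe for trading accuracy against turnaround time.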

The basic idea is to focus on keeping the time spent on each risk proportionate to its error, minimising the overall error for the whole portfolio for a given period of time. "If I have 10 minutes to get the result, it can be much more accurate, but if I need it in a few minutes or 20 seconds, I would still be able to choose the right target amount of time," says Hulme. "It is up to the users. When there has been a very large move – like Brexit, for example – and traders need to quickly rerun the batch intraday on a less accurate but faster run, they can do that by changing a single parameter."

By applying a lower number of paths and time-steps for less material counterparties, the risks of the overall XVA book can be calculated much faster. More precise calculations can be carried out for the riskier profiles by running a greater number of paths and time-steps. Because of this flexibility, the Greeks can be obtained one to two orders of magnitude faster than it would take to run a complete simulation, the authors say.

What this means is that not all banks have to rely on the magic of AAD to quickly calculate their risks.

"Everyone would want to implement AAD since it would increase precision and stability, with several orders of magnitude in performance improvement; however, the cost of doing that is just too high," says Francois Bergeaud, the head of XVA quantitative analytics at Royal Bank of Scotland in London. "It is the fastest thing you can think of, but if you don't want to invest in a whole team of quants – say, 10 people – for three years working on refactoring the library, then this approach is a good practical compromise."

As the role of quants becomes increasingly driven by technology and optimisation, some tricks of the trade will survive and others will die out, in a tough evolutionary battle accelerated by rising costs and regulatory changes. AAD is not likely to be made extinct any time soon, but the optimisation proposed by the authors appears to give banks with fewer resources an option that puts them on a level playing field. That's all the more important at a time when risk management and capital savings come with a huge price tag attached.

Also out this month: Operational risk modelled analytically II: classification invariance, by Vivien Brunel
