Quant of the year: Alexandre Antonov

Numerix quant revolutionises negative rates modelling

Alexandre Antonov: "We realised once rates approach zero, they tend to stick to zero"


When key interest rates in Europe breached zero in 2012, dealers were presented with a brand-new problem – standard derivatives-pricing models were designed to work only with positive rates, and a fix would be needed to get them to accept negative values as well. Quants had already been struggling with misbehaving models as rates slid towards the zero threshold, and the prospect of taking it a step further was daunting.

So, the industry settled for a crude fix – shifting the distribution of the rate by a small amount, so it remained positive. It worked, but came at a cost. Every time rates became too negative, the model had to be recalibrated to determine a new value for the shift. No one complained though, because a better solution did not exist until the topic drew the attention of Alexandre Antonov, Risk's quant of the year for 2016.
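
In broad strokes, the fix amounts to displacing the rate by a fixed positive amount s before it enters the model, so the shifted quantity stays positive as long as rates stay above -s. Written in standard shifted-SABR notation (the sketch below is generic, not any particular desk's parameterisation):

```latex
dF_t = \sigma_t \,(F_t + s)^{\beta}\, dW^1_t , \qquad
d\sigma_t = \nu\, \sigma_t\, dW^2_t , \qquad
\langle dW^1, dW^2 \rangle_t = \rho\, dt , \qquad F_t \ge -s .
```

The shift s caps how negative F_t is allowed to go, which is exactly why it has to be re-chosen – and the model recalibrated – whenever rates threaten to breach that floor.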

Antonov, a senior vice-president in the quantitative research team at Numerix, published three papers in Risk this year, each commended by reviewers for its topicality and practicality, but the one that stood out the most to his peers was The free boundary SABR: natural extension to negative rates, which he published in September with colleagues Michael Konikov and Michael Spector.

Work started roughly a year earlier, with Antonov pitching the idea to his co-authors that introducing a so-called ‘free-boundary’ condition to the commonly used stochastic alpha beta rho (SABR) interest rate model might eliminate the need for a shifted model for negative rates. The purpose behind a free-boundary condition was to allow rates to go to any negative value, so there was no need for a fixed shift that had to be calibrated frequently.

The idea was revolutionary. "The ground-breaking feature of the model is that you don't decide in advance how negative the rates could become. With other models you have to, but then markets would change and they would become more negative than you previously thought. Then you would have to change your parameters, causing a profit-and-loss adjustment," says Alexander Sokol, chief executive officer and head of quantitative research at technology vendor CompatibL.

The ground-breaking feature of the model is that you don't decide in advance how negative the rates could become
Alexander Sokol, CompatibL

For Antonov and his colleagues, the major breakthrough came when they did an initial run of the model. At this stage, the free-boundary condition was producing a rather unappealing spike at zero in the distribution of the rates. In quantitative finance, a field obsessed with smooth curves and elegant solutions, spikes are generally seen as a sign of misbehaviour, and models producing them usually do not make it very far. However, Antonov noticed something strange in the observed distributions of Swiss franc Libor rates – they had spikes at zero as well.

There seemed to be a common-sense explanation. "Policymakers would probably take time to think about whether they want to go negative and then suddenly make a decision, so we realised once rates approach zero, they tend to stick to zero," says Antonov.

The ‘stickiness’ of rates near zero caused the distribution to spike at that point, so having that same feature in the model suddenly made sense. "We looked at this graph and realised a good model can not only accept negative rates but should have some sort of singularity at zero as well," he adds.

The free boundary, which is the core of Antonov's model, is a combination of an absorbing and a reflecting condition – the former is used to make rates sticky near zero while the latter ensures they can go to any negative value once they fall below the threshold.

"We tried to combine many solutions of the absorbing and reflecting conditions in the right way, such that SABR can go negative with realistic properties of probability such as martingality, for example. In the end we got a unique solution, which also led to the probability density spike at zero," says Antonov.

The resulting model was analytically solvable, like the shifted SABR approach, but it also offered two additional benefits – it did not require constant calibration and it matched observed dynamics.

One of the fans of this work is Paul Glasserman, a professor at Columbia Business School, who found the so-called ‘stickiness’ around zero to be a game-changer. "If you look at actual rates, they are spending a lot of time near zero, so zero has special status in the data, and his model reproduces that feature and comes up with a clean, elegant solution. It kind of blows you away," he says.

Clever tricks abound in quantitative research, but not many can be used in practice. Antonov's free-boundary SABR is an exception to that rule, advocates say.

One needs to make a big intellectual and programming effort. You see a lot of people trying to dig in this direction. At the moment the story is not final
Alexandre Antonov

CompatibL's Sokol argues the model could help traders to hedge better and save money as a result: "While using other shifted-SABR models, traders will usually not hedge close to the negative boundary because of unstable sensitivities. They would instead allocate additional funds in reserves in order to pay for their inability to hedge."

With the free-boundary SABR, this is no longer an issue. "It does not require adjustments to the model parameters, so you are able to hedge better, and every time the traders can hedge better and avoid taking a reserve, they are able to charge less for the trade. This can save market participants money every day," says Sokol.

Many dealers have already jumped at the opportunity. According to Antonov, at least a dozen European banks are now either implementing or using the model in their systems.

Before starting his career at Numerix almost 18 years ago, Antonov was, as he describes himself, "a pure theoretician with a pen and a paper", with no knowledge of programming. It was while researching quantum field theory in Paris, shortly after completing his PhD at the Landau Institute for Theoretical Physics, part of the Russian Academy of Sciences, that he was hired by Sokol, one of the founders of Numerix, in 1998.

Now, coding and algorithm design, which form the backbone of efficient practical implementation at banks, make up a big chunk of what Antonov does. His peers who voted for him this year cite practical benefits as the defining feature of most of his papers.

In January 2015, Antonov and his colleagues, Serguei Mechkov and Serguei Issakov, published Backward induction for future values, which offers an exposure calculation methodology for exotics that pares not only the computational effort, but also the time taken to write the code.

Typically, exposures for these products, which form the main ingredient for calculating valuation adjustments (XVAs) and value-at-risk, are computed through a classical American Monte Carlo (AMC) method. AMC is carried out by first running a backward Monte Carlo pass, carefully calculating the exercise conditions of the exotic, and then running a forward Monte Carlo pass that aggregates the final result. Antonov and his colleagues eliminate the second step by calculating the future value directly in the backward pass.
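
The gist of the single-sweep idea can be seen in a toy regression-based backward induction that records the deal's value on every path and date as it goes, so the exposure profile drops out without a second forward pass. The sketch below is purely illustrative – a simplified Bermudan put under geometric Brownian motion with made-up parameters – not the authors' algorithm.

```python
# Toy illustration (not the authors' production algorithm): regression-based
# backward induction that records each path's deal value at every date, so the
# exposure profile is produced in the same backward sweep - no second forward
# Monte Carlo pass is needed. The product, parameters and variable names are
# made up for the example.
import numpy as np

rng = np.random.default_rng(seed=42)
n_paths, n_steps, dt = 50_000, 50, 0.1
r, sigma, s0, strike = 0.01, 0.20, 100.0, 100.0

# Forward simulation of the underlying (geometric Brownian motion).
z = rng.standard_normal((n_paths, n_steps))
s = s0 * np.exp(np.cumsum((r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1))

disc = np.exp(-r * dt)
value = np.maximum(strike - s[:, -1], 0.0)       # terminal payoff of a toy Bermudan put
future_value = np.zeros((n_paths, n_steps))
future_value[:, -1] = value

# Backward sweep: regress the one-step-discounted value on the state to get the
# continuation value, apply the exercise decision, and store the per-path deal
# value (the exposure ingredient) at every date as we go.
for t in range(n_steps - 2, -1, -1):
    value *= disc
    basis = np.vander(s[:, t] / s0, 4)           # simple polynomial basis in the (scaled) state
    coeff, *_ = np.linalg.lstsq(basis, value, rcond=None)
    continuation = basis @ coeff
    exercise = np.maximum(strike - s[:, t], 0.0)
    value = np.where(exercise > continuation, exercise, value)
    future_value[:, t] = np.maximum(exercise, continuation)

# Expected positive exposure profile, straight out of the backward pass.
epe = np.maximum(future_value, 0.0).mean(axis=0)
print(epe[:5])
```

A production system would additionally track which paths have already exercised and would drive the payoff logic from the pricing script rather than hard-coding it – which is exactly where the generalisation described next comes in.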

The main contribution, however, was the ease with which the algorithm could be coded. Structured deals are often described in pricing systems through a payoff language. Calculating the future values using the standard AMC requires additional logic on top of that pricing script and is heavily dependent on the type of deal, requiring traders to modify each of these scripts when calculating the various XVAs. Antonov and his colleagues designed the algorithm in such a way that the pricing step was generalised – in that it could be applied uniformly across asset classes.

"This is very important for XVA calculations, especially doing them generically in real production systems rather than toy examples. Without this, a quant or a group of quants will spend months, if not man years, coding each payoff by hand for XVAs. Here, the computer can do all the work," says two-time quant of the year winner Vladimir Piterbarg, who heads the quantitative analytics team at Rokos Capital Management in London.

Without this, a quant or a group of quants will spend months, if not man years, coding each payoff by hand for XVAs. Here, the computer can do all the work
Vladimir Piterbarg, Rokos Capital Management

Antonov's third paper, FVA for general instruments, written with Numerix colleague Ion Mihai and Intesa Sanpaolo senior quant Marco Bianchetti, was another contribution towards generalising calculations across asset classes. The quants proposed an approximation method for the calculation of the funding valuation adjustment (FVA) that can be applied easily to both vanilla and exotic derivatives.

FVA is a pricing adjustment that reflects the cost of funding the collateralised hedge on an uncollateralised trade. It is very product-dependent, but generally involves the tough task of solving a non-linear partial differential equation (PDE). Antonov's solution, on the other hand, can be applied to a wider range of instruments, including those with path-dependent and callable features, and for different specifications of FVA.
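
To fix intuition, a common first-order way of writing the adjustment – not the specific approximation derived in the paper – integrates the funding spread against the expected uncollateralised exposure:

```latex
\mathrm{FVA} \;\approx\; -\int_0^T s_F(t)\; \mathbb{E}\!\left[ D(0,t)\, V(t) \right] dt ,
```

where s_F is the bank's funding spread, D(0,t) the discount factor and V(t) the future value of the uncollateralised trade. The catch is that for exotic and callable products V(t) is itself expensive to compute, and a fully consistent treatment turns the pricing problem non-linear – hence the appeal of an approximation that works uniformly across instrument types.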

In keeping with his track record for seeking practical solutions, Antonov already has his eyes on what he considers to be the next big implementation challenge: adjoint algorithmic differentiation (AAD). This is a mathematical trick that can boost computing speeds for risk sensitivities by up to a thousand times and is already in use at major dealers such as Barclays, Credit Suisse, Nomura and UBS.

Many complain, though, that the initial setup of the algorithm is challenging, involving a complete revamp of banks' IT libraries and the storage of many intermediate values.

"A relatively quick implementation of AAD can be done only in a limited setup. Moreover, a naive AAD will face very significant memory consumption. To make it general and memory efficient, one needs to make a big intellectual and programming effort. You see a lot of people trying to dig in this direction. At the moment the story is not final," says Antonov.

If it is a practical problem, Antonov can be expected to crack it. "In all his papers there is a clear practical problem, amazing mathematics and practical implementation. I think the combination of those three elements is really quant work at its best," says Columbia University's Glasserman.
