Podcast: Piterbarg and Antonov on alternatives to neural networks

Two novel approximation techniques can overcome the curse of dimensionality

The rise of neural networks in finance has been rapid and nearly uncontested. The idea that this type of artificial intelligence is the best solution for all manner of problems is almost casually accepted by quants. But there are some sceptics – one such being Vladimir Piterbarg, head of quantitative analytics and development at NatWest Markets. Piterbarg jokes that his warnings about poorly thought-out applications of neural networks, which he has compared to a hammer in search of nails, have given him a bad reputation in the industry.

Despite the many successful applications of neural networks in finance and, especially, outside it, Piterbarg argues quants too often turn a blind eye to their drawbacks: long training times, heavy data requirements, and limited predictability and interpretability of outputs.

He has spent the past two years working with Alexandre Antonov, quantitative research and development lead at the Abu Dhabi Investment Authority, on alternative approaches to solving quantitative problems that do not have the same drawbacks.

In this episode of Quantcast, Piterbarg and Antonov discuss the two methods they developed to approximate functions that are cumbersome to calculate – for instance, because they involve lengthy simulations – and to compute conditional expectations.

“We’ve come up with much better and much faster methods for the typical problems that we see in finance,” says Piterbarg.

 

The two methods are called generalised stochastic sampling (GSS) and functional tensor train (FTT). Antonov describes GSS as “a parametric representation of the function”. The method covers the function’s multi-dimensional domain with bell-shaped basis functions centred at randomly distributed points; the function’s values at randomly selected sample points are then used to reconstruct it. The approach mimics image reconstruction and avoids the curse of dimensionality that classic parametric models encounter in a multi-dimensional space.
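In spirit, this resembles radial-basis-function regression: place Gaussian “bells” at randomly scattered centres and fit their weights to sampled function values by linear least squares. The numpy sketch below illustrates only that general idea, not the authors’ actual GSS algorithm; the target function, centre count and bell width are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a function that is slow to evaluate in practice.
def f(x):
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1] ** 2)

d, n_centres, n_samples = 2, 200, 1000
centres = rng.uniform(-1, 1, size=(n_centres, d))  # randomly placed centres
width = 0.3                                        # bell width (illustrative)

def phi(x):
    # Bell-shaped (Gaussian) basis functions centred at the random points.
    dist2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-dist2 / (2 * width ** 2))

# Sample the function at random points and fit weights by least squares.
x_train = rng.uniform(-1, 1, size=(n_samples, d))
w, *_ = np.linalg.lstsq(phi(x_train), f(x_train), rcond=None)

# The fitted surrogate is now cheap to evaluate anywhere in the domain.
x_test = rng.uniform(-1, 1, size=(500, d))
err = np.max(np.abs(phi(x_test) @ w - f(x_test)))
print(f"max abs error: {err:.3e}")
```

Once the weights are fitted, evaluating the surrogate is a single matrix–vector product, which is the source of the speed-up over re-running the slow original function.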

“If the function is relatively smooth, then our method’s precision is higher [than that of neural networks],” explains Antonov. “All the problems we have described for neural networks are more or less overcome.”

FTT reduces the dimensionality of a function. Inspired by a 2011 paper by Ivan Oseledets, the method allows, for instance, a 10-dimensional function to be decomposed into a product of two-dimensional functions. “That gives structure, explainability, and much faster performance,” says Piterbarg.
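In the simplest case of two variables, such a decomposition is a low-rank separation f(x, y) ≈ Σₖ gₖ(x)·hₖ(y), which an SVD of the function sampled on a grid computes directly; tensor trains chain this idea across many variables. The toy numpy sketch below shows only this two-dimensional special case, with an illustrative target function of exact rank 2, and is not the authors’ FTT construction.

```python
import numpy as np

# Toy target: a sum of two separable terms, so its exact rank is 2.
def f(x, y):
    return np.sin(x) * np.cos(y) + 0.5 * np.exp(-x ** 2) * y

xs = np.linspace(-1, 1, 100)
ys = np.linspace(-1, 1, 100)
A = f(xs[:, None], ys[None, :])      # sample the function on a grid

# The SVD gives the best low-rank separation A ~ U_r diag(s_r) V_r^T,
# i.e. f(x, y) ~ sum_k g_k(x) h_k(y) -- the d = 2 case of a tensor train.
U, s, Vt = np.linalg.svd(A)
r = 2
A_r = U[:, :r] * s[:r] @ Vt[:r]

err = np.max(np.abs(A - A_r))
print(f"rank-{r} max abs error: {err:.2e}")
```

For higher dimensions the same idea is applied variable by variable, so storage and evaluation cost grow linearly in the number of dimensions rather than exponentially.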

The two methods can be combined to “translate the problem from the function being sampled on a stochastic [grid] into a linear tensor train problem”, says Piterbarg. That lightens the computation considerably, while still accommodating multi-dimensional functions.

These methods can be applied to any problem described by functions that are slow to calculate, as well as problems related to the computation of conditional expectations, such as payoffs of complex financial products. The computation of derivatives valuation adjustments, which require the calculation of a large number of present values of simulated paths, is another use case.
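Conditional expectations of this kind are commonly approximated by regressing realised payoffs on functions of the state, as in least-squares Monte Carlo; GSS and FTT can be read as supplying richer, better-behaved bases for the same task. The numpy sketch below shows only that generic regression idea on a toy process, not the authors’ method; the process, payoff and polynomial basis are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy simulation: a state X1 observed at an early date, and a payoff paid
# later that depends on X2 = X1 + noise. We want E[payoff | X1] per path.
n = 50_000
x1 = rng.normal(0.0, 1.0, n)
x2 = x1 + rng.normal(0.0, 0.5, n)
payoff = np.maximum(x2, 0.0)

# Least-squares regression of the realised payoff on a polynomial basis in
# X1 yields a cheap functional approximation of the conditional expectation.
basis = np.vander(x1, 5)              # columns: x1**4, x1**3, ..., 1
coef, *_ = np.linalg.lstsq(basis, payoff, rcond=None)

# The fitted curve can be evaluated at any state, e.g. deep out/in the money.
print(np.polyval(coef, -2.0), np.polyval(coef, 2.0))
```

The fitted curve replaces a nested simulation at each exposure date, which is exactly why such approximations matter for valuation adjustments, where present values are needed along a large number of simulated paths.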

It’s too early to know how widely this approach will be adopted in the industry. “NatWest Markets is looking at it,” says Piterbarg, adding the caveat that the research has only recently been published and it takes time for banks to decide whether to switch to a new model.

But he’s optimistic that the industry will see the value of GSS and FTT. “For a certain class of problems that are important in finance, this method beats neural networks hands down, there’s no doubt in my mind about it.”

Index

00:00 Intro

01:48 The drawbacks of neural networks

08:00 The need for alternative approaches

10:38 Generalised stochastic sampling

17:40 Functional tensor train

24:00 Combining the methods in practice

30:10 Comparing GSS and FTT to neural networks

31:40 Next steps

To hear the full interview, listen in the player above, or download. Future podcasts in our Quantcast series will be uploaded to Risk.net. You can also visit the main page here to access all tracks, or go to the iTunes store, Spotify or Google Podcasts to listen and subscribe.
