Podcast: Piterbarg and Antonov on alternatives to neural networks
Two novel approximation techniques can overcome the curse of dimensionality
The rise of neural networks in finance has been rapid and nearly uncontested. The idea that this type of artificial intelligence is the best solution for all manner of problems is almost casually accepted by quants. But there are some sceptics – one such being Vladimir Piterbarg, head of quantitative analytics and development at NatWest Markets. Piterbarg jokes that his warnings about poorly thought-out applications of neural networks, which he has compared to a hammer in search of nails, have given him a bad reputation in the industry.
Despite neural networks' many successful applications in finance and, especially, outside it, Piterbarg argues quants too often turn a blind eye to their drawbacks: long training times, heavy data requirements, and the limited predictability and interpretability of their outputs.
He has spent the past two years working with Alexandre Antonov, quantitative research and development lead at the Abu Dhabi Investment Authority, on alternative approaches to solving quantitative problems that do not have the same drawbacks.
In this episode of Quantcast, Piterbarg and Antonov discuss the two methods they developed to approximate functions that are cumbersome to calculate – for instance, because they involve lengthy simulations – and to compute conditional expectations.
“We’ve come up with much better and much faster methods for the typical problems that we see in finance,” says Piterbarg.
The two methods are called generalised stochastic sampling (GSS) and functional tensor train (FTT). Antonov describes GSS as “a parametric representation of the function”. The method sets up a grid of randomly placed bell-shaped basis functions to represent the multi-dimensional space of the function, then uses randomly sampled points to reconstruct it. The approach mimics image reconstruction and avoids the curse of dimensionality that classic parametric models encounter in multi-dimensional spaces.
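The episode does not spell out the GSS construction, but the idea described, randomly placed bell-shaped basis functions fitted to randomly sampled function values, can be illustrated with a minimal sketch. Everything below (the target function, the Gaussian bumps, the widths and counts) is an assumption for illustration, not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # Hypothetical stand-in for a slow-to-evaluate pricing function.
    return np.sin(x[:, 0]) * np.exp(-x[:, 1] ** 2)

dim, n_basis, n_samples = 2, 200, 2000

# Randomly placed Gaussian ("bell-shaped") basis functions.
centres = rng.uniform(-2.0, 2.0, size=(n_basis, dim))
width = 0.5

def features(x):
    # Evaluate every Gaussian bump at every sample point.
    d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

# Randomly sampled training points, then a plain linear least-squares fit:
x_train = rng.uniform(-2.0, 2.0, size=(n_samples, dim))
coef, *_ = np.linalg.lstsq(features(x_train), target(x_train), rcond=None)

# Out-of-sample accuracy check.
x_test = rng.uniform(-2.0, 2.0, size=(500, dim))
err = np.max(np.abs(features(x_test) @ coef - target(x_test)))
```

Because the representation is linear in the coefficients, the fit is a single least-squares solve rather than an iterative training loop, which is where the speed and predictability claims come from.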
“If the function is relatively smooth, then our method’s precision is higher [than that of neural networks],” explains Antonov. “All the problems we have described for neural networks are more or less overcome.”
FTT reduces the dimensionality of a function. Inspired by a 2011 paper by Ivan Oseledets, the method allows a 10-dimensional function to be decomposed into a product of two-dimensional functions. “That gives structure, explainability, and much faster performance,” says Piterbarg.
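The tensor-train decomposition Piterbarg refers to can be sketched with the standard sequential-SVD algorithm from Oseledets's 2011 paper: a grid of function values is split, one dimension at a time, into small three-dimensional "cores" (effectively matrix-valued univariate functions). The example tensor, grid and tolerance below are illustrative choices, not the authors' setup:

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    # Sequential-SVD tensor-train decomposition (Oseledets, 2011).
    # Returns 3-D cores G_k with shapes (r_{k-1}, n_k, r_k).
    shape = tensor.shape
    cores, rank = [], 1
    mat = np.asarray(tensor)
    for n in shape[:-1]:
        mat = mat.reshape(rank * n, -1)
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        keep = max(1, int((s > tol * s[0]).sum()))   # truncate small ranks
        cores.append(u[:, :keep].reshape(rank, n, keep))
        mat = s[:keep, None] * vt[:keep]
        rank = keep
    cores.append(mat.reshape(rank, shape[-1], 1))
    return cores

def tt_eval(cores, idx):
    # Contract the cores at a multi-index to recover one tensor entry.
    v = np.ones(1)
    for g, i in zip(cores, idx):
        v = v @ g[:, i, :]
    return v.item()

# A low-rank 4-D tensor of function values on a 10-point grid.
grid = np.linspace(0.0, 1.0, 10)
t = np.einsum("i,j,k,l->ijkl", np.sin(grid), np.cos(grid), grid, grid ** 2)
cores = tt_svd(t)
```

When the underlying function has low tensor-train rank, storage and evaluation cost grow linearly in the number of dimensions instead of exponentially, which is the dimensionality reduction the article describes.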
The two methods can be combined to “translate the problem from the function being sampled on a stochastic [grid] into a linear tensor train problem”, says Piterbarg. That lightens the computation considerably, while still accommodating multi-dimensional functions.
These methods can be applied to any problem described by functions that are slow to calculate, as well as problems related to the computation of conditional expectations, such as payoffs of complex financial products. The computation of derivatives valuation adjustments, which require the calculation of a large number of present values of simulated paths, is another use case.
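As one concrete instance of such a conditional-expectation problem, simulated payoffs can be regressed on basis functions of the state, in the spirit of the classic least-squares Monte Carlo approach. The dynamics, payoff and polynomial basis below are purely illustrative and are not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: estimate E[payoff(S_T) | S_t] across simulated paths, the kind
# of conditional expectation that appears in exposure and XVA calculations.
# The dynamics, payoff and basis here are illustrative assumptions.
n_paths = 50_000
s_t = np.exp(rng.normal(0.0, 0.2, n_paths))          # state at time t
s_T = s_t * np.exp(rng.normal(-0.02, 0.2, n_paths))  # state at maturity
payoff = np.maximum(s_T - 1.0, 0.0)                  # call payoff at T

# Regress realised payoffs on polynomial basis functions of the state:
basis = np.vander(s_t, 5)                # columns [s^4, s^3, s^2, s, 1]
coef, *_ = np.linalg.lstsq(basis, payoff, rcond=None)
cond_exp = basis @ coef                  # estimated E[payoff | S_t], per path
```

With one state variable a polynomial basis suffices; in the multi-dimensional settings the article targets, the basis itself is what blows up, which is the gap GSS and FTT are meant to fill.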
It’s too early to know how widely this approach will be adopted in the industry. “NatWest Markets is looking at it,” says Piterbarg, adding the caveat that the research has only recently been published and it takes time for banks to decide whether to switch to a new model.
But he’s optimistic that the industry will see the value of GSS and FTT. “For a certain class of problems that are important in finance, this method beats neural networks hands down, there’s no doubt in my mind about it.”
Index
00:00 Intro
01:48 The drawbacks of neural networks
08:00 The need for alternative approaches
10:38 Generalised stochastic sampling
17:40 Functional tensor train
24:00 Combining the methods in practice
30:10 Comparing GSS and FTT to neural networks
31:40 Next steps
To hear the full interview, listen in the player above, or download. Future podcasts in our Quantcast series will be uploaded to Risk.net. You can also visit the main page here to access all tracks, or go to the iTunes store, Spotify or Google Podcasts to listen and subscribe.
Copyright Infopro Digital Limited. All rights reserved.