Spotting co-movement breakdowns with neural networks
Autoencoders can detect changes in the relationships between assets in real time
The co-movement of financial assets is a tricky thing to measure. It is dynamic and prone to significant shifts during market upheavals. One possible approach is to use some form of principal component analysis (PCA), a statistical technique that reduces the dimensionality of a dataset while retaining much of its variability. But most market practitioners consider PCA unfit for this purpose.
“It only really works for linear Gaussian systems, which is as far from finance as you could imagine,” says Stephen Roberts, director of the Oxford-Man Institute of Quantitative Finance, and Man Group professor of machine learning at Oxford University.
Together with Bryan Lim, a researcher at the Oxford-Man Institute, and Stefan Zohren, a senior researcher at Man Group, Roberts proposes a solution that uses autoencoders – a type of artificial neural network that relies on a bottleneck structure to compress an input dataset into a few latent factors. They introduce an indicator, the autoencoder reconstruction ratio (ARR), designed to capture assets’ co-movements in real time by tracking the average reconstruction error: a measure of how well the few latent factors generated by the autoencoder can reproduce the original dataset.
The basic idea is that when assets move more independently – that is, their co-movements decrease – the latent factors lose some of their explanatory power and will reconstruct the original data less accurately. As the average reconstruction error increases, so does the ARR – signalling a scenario where the market is more stable and diversification strategies are more effective.
Conversely, when the latent factors are able to reconstruct more of the original data, the reconstruction error is lower and the ARR goes down, indicating that co-movements have increased. Here, correlations shoot up to dangerous levels and markets could see a spike in volatility or a drawdown, resulting in a breakdown of diversification strategies.
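The mechanism behind this can be sketched in a few lines. The snippet below uses a truncated SVD as a stand-in for a trained linear autoencoder (at convergence, a linear autoencoder spans the same subspace as the top principal components); the two-regime synthetic data, the variable names and the normalisation are illustrative assumptions, not the paper's exact definition of the ARR.

```python
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_assets, k = 500, 10, 2  # observations, assets, latent factors


def reconstruction_error(returns, k):
    """Normalised reconstruction error of the best rank-k bottleneck.

    A truncated SVD is used as a stand-in for a trained linear
    autoencoder: both recover the top-k principal subspace.
    """
    x = returns - returns.mean(axis=0)     # de-mean each asset's returns
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    x_hat = (u[:, :k] * s[:k]) @ vt[:k]    # decode from k latent factors
    return np.sum((x - x_hat) ** 2) / np.sum(x ** 2)


# Regime 1: strong co-movement -- one common factor drives all assets
factor = rng.normal(size=(n_obs, 1))
loadings = rng.normal(size=(1, n_assets))
comoving = factor @ loadings + 0.1 * rng.normal(size=(n_obs, n_assets))

# Regime 2: assets move independently of one another
independent = rng.normal(size=(n_obs, n_assets))

err_comoving = reconstruction_error(comoving, k)
err_independent = reconstruction_error(independent, k)
print(f"co-moving regime:   {err_comoving:.3f}")
print(f"independent regime: {err_independent:.3f}")
```

The reconstruction error, and hence the ARR, is small when a few common factors drive the market and rises as assets decouple, which is exactly the signal described above.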
The ARR can also be applied to improve forecasting of realised volatility and market crashes. Numerical tests suggest this is more effective with high-frequency data. The paper shows it performs better at a five-minute frequency than with daily market data. “Being able to forecast intraday volatility is obviously very important. The ARR improves the forecast and thus helps to effectively risk-manage your position,” says Zohren.
The technique could be used to improve volatility scaling for systematic trading strategies. “The ARR enables portfolios to respond rapidly to heightened levels of asset co-movements that historically have been indicators of wide-scale market stress,” says Anthony Ledford, chief scientist at Man AHL.
Importantly, the use of ARR is not bounded by the size of the dataset, but rather by computational power – an ever-growing resource. “There is no theoretical limit to the number of assets and asset classes this metric can handle simultaneously. The only limit is computational. [With] too many assets, the answer might take too long to arrive to be useful for trading purposes,” explains Roberts.
The idea to employ autoencoders came from Lim. “The insight Bryan [Lim] had was to realise that you can use these autoencoders to form a kind of deep sparse compression of financial assets. And you can look to see how much you’ve managed to compress in this lower dimensional representation space,” explains Roberts.
The ARR is analogous to a real-time, non-linear version of the so-called absorption ratio, which measures co-movement changes by describing the portion of variance within a system that is explained by PCA. “What we’re trying to do here is inspired by this idea [of the absorption ratio] and turns it into a nonlinear model,” says Zohren. “Interesting features and dependencies, which you could only learn using machine learning methods, are captured by the encoder. The encoder effectively takes the returns and tries to encode them in the model, which is then used for reconstructing the returns.”
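For comparison, the absorption ratio itself is straightforward to compute: it is the share of total variance explained by the top principal components of the return covariance matrix. A minimal sketch follows; the two synthetic regimes are illustrative assumptions, not data from the paper.

```python
import numpy as np


def absorption_ratio(returns, n_factors):
    """Share of total variance 'absorbed' by the top principal components."""
    cov = np.cov(returns, rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # largest first
    return eigvals[:n_factors].sum() / eigvals.sum()


rng = np.random.default_rng(1)
n_obs, n_assets = 500, 10

# One common factor: most variance sits in a single principal component
factor = rng.normal(size=(n_obs, 1))
comoving = factor @ rng.normal(size=(1, n_assets)) \
    + 0.1 * rng.normal(size=(n_obs, n_assets))
print(absorption_ratio(comoving, 1))     # close to 1: high co-movement

# Independent assets: variance is spread evenly across components
independent = rng.normal(size=(n_obs, n_assets))
print(absorption_ratio(independent, 1))  # close to 1 / n_assets
```

The ARR swaps the PCA projection in this calculation for an autoencoder, letting the compression capture non-linear dependencies as well.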
“In the reconstruction ratio, the autoencoder takes the place of the PCA,” he adds.
Autoencoders are not new in finance. They are designed to recognise patterns and identify exceptions, and have been widely used in fraud detection – and to some extent, to detect trading signals. In 2018, Alexei Kondratyev, global head of data analytics at Standard Chartered Bank, proposed using autoencoders to analyse yield curves.
Kondratyev sees Lim, Zohren and Roberts’s results as part of a wider and welcome trend in quant finance. “It’s yet another paper that shows how we should move beyond PCA and other simple linear models, towards more sophisticated nonlinear models,” he says. “Now we have computational power and decades of progress in machine learning. We have the tools that allow us to do more and better than just using PCA.”
There is room for further development. One avenue could be to connect the trading signals from the model to an execution system. “An interesting extension to this would be to expand it to tie in with trade execution algorithms,” says Roberts.