Higher and higher

High-frequency data


Each day, thousands of traders around the world sit watching prices tick by on their screens. They might trade on some, and over the course of the day their institutions will record a selection - market open and close, high and low, and quotes and prices at which they transacted. But the majority of the torrent of data continuously pumped out by exchanges, trading platforms and market data vendors disappears into the ether. Along with it goes an abundance of evidence of hidden dynamics in the markets.

So-called real-time trading and risk systems might respond more or less immediately to market events, but few are based on an analysis conducted at the frequency at which the markets really operate. In fact, most systems incorporate assumptions about the markets based on hourly, daily or even less-frequent data. "The assumption of a lot of pricing and trading models is that on a small scale there is noise, but on a large scale there is movement," says Jessica James, global head of investor risk advisory, foreign exchange, at Citi.

However, analysis shows that far from being anarchic noise, high-frequency data can show a structure and behaviour that is as different, complex and informative as the view of the everyday physical world that is revealed under an electron microscope. Those who have become familiar with this world of tick data know that it offers opportunities that have so far been little exploited, while many who have been drawn into the high-frequency realm through algorithmic trading realise that its unique characteristics cannot be ignored.

One of the first to become curious about what an examination of high-frequency financial data would reveal was former foreign exchange trader Richard Olsen. In 1986, he began collecting tick data for the forex markets - an activity he has continued without interruption ever since. Olsen's aim was to emulate the natural sciences, where data collection and analysis and the iterative development of models had proved extraordinarily powerful compared with the traditional method of theorising first and looking for evidence later.

One of the first preconceptions a study of the forex data challenged was the notion of a price. "When you start to collect high-frequency data, you quickly realise that a price is a complex object," says Olsen, who is now chief executive of Zurich-based Olsen, a group of companies that includes data services (Olsen Financial Technologies), online currency trading and conversion services (Oanda), and a hedge fund (Olsen Invest). Among the questions that arise are: is the price a bid or offer; is it a quote or a traded price; who quoted it; at what intervals does the source publish the price; and what does the source do when actual prices are not available?

Unlike exchanges, where prices and volumes are available for deals that have been transacted, prices in an over-the-counter market such as forex are far more ambiguous. "When you study the data, you realise it is quotes, when what you are really interested in is transactions," says Olsen.

As the database grew and Olsen and his colleagues applied data analysis techniques borrowed from natural sciences, they made a number of important discoveries - discoveries that not only helped interpret the ambiguity of the price quotes pouring out of the market, but also suggested that, at the tick level, the market exhibits a structure and behaviour that has profound implications for our understanding of markets in general. Most importantly, it suggested that with the right tools and technologies, these could be profitably exploited. Twenty years later, nearly 300 organisations license Olsen's data, including many universities researching market micro-structure, as well as a number of banks and hedge funds pursuing strategies based on high-frequency data analysis and algorithmic trading.

One of Olsen's first discoveries was a daily seasonality in the global forex market. Over a 24-hour period, volatility exhibits a pattern that correlates consistently with the opening and closing times of the markets in the US, Europe and Asia. Lowest volatility occurs during the lunch hour in Japan, when it is night in the US and Europe, and the highest activity occurs in the early European afternoon, which coincides with morning in the US. This effect is completely overlooked if the market is simply sampled at the same hour each day.
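The seasonality is easy to surface once ticks are bucketed by time of day. The sketch below averages absolute log-returns by UTC hour; it assumes a simple list of (timestamp, price) pairs and generates its own synthetic ticks, so no real feed format or vendor API is implied:

```python
from collections import defaultdict
from datetime import datetime, timedelta
import math
import random

def hourly_volatility_profile(ticks):
    """Average absolute log-return per UTC hour of day.
    `ticks` is a time-ordered list of (datetime, price) pairs --
    a hypothetical format, not any vendor's actual feed."""
    buckets = defaultdict(list)
    for (_, p0), (t1, p1) in zip(ticks, ticks[1:]):
        buckets[t1.hour].append(abs(math.log(p1 / p0)))
    return {h: sum(rs) / len(rs) for h, rs in sorted(buckets.items())}

# Synthetic day of minute ticks: quiet around the Tokyo lunch hour
# (~03:00 UTC), busy in the early European afternoon (~13:00 UTC).
random.seed(0)
ticks, price, t = [], 1.2000, datetime(2007, 5, 1)
for _ in range(24 * 60):
    scale = {3: 0.0001, 13: 0.0010}.get(t.hour, 0.0004)
    price *= math.exp(random.gauss(0, scale))
    ticks.append((t, price))
    t += timedelta(minutes=1)

profile = hourly_volatility_profile(ticks)
assert profile[13] > profile[3]  # the pattern survives aggregation
```

Sampling the same series once a day, by contrast, would collapse each 24-hour cycle to a single point and erase the pattern entirely.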

To take the effect into account when evaluating a price, Olsen suggested imposing a trader time frame over the tick data, which stretches time during periods of high market activity and compresses it during low periods in order to put the relative importance of events in more realistic perspective. Good traders have an intuitive sense of this effect and adjust for it, says Olsen, but his group's research was the first to demonstrate it statistically.
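One way to picture the idea is to let a "trader time" coordinate advance with cumulative activity rather than with the clock. The toy function below uses cumulative absolute log-return as its activity measure; Olsen's actual time deformation is more sophisticated, so treat this purely as an illustration:

```python
import math

def trader_time(ticks):
    """Map clock time to a 'trader time' coordinate that advances with
    cumulative activity -- here, cumulative absolute log-return -- so
    busy periods are stretched and quiet ones compressed.
    A toy version of the idea, not Olsen's actual deformation."""
    tau, last, out = 0.0, None, []
    for t, p in ticks:
        if last is not None:
            tau += abs(math.log(p / last))
        out.append((t, tau))
        last = p
    return out

# Equal clock spacing, but the third tick carries a violent move:
# that one move consumes almost all of the trader-time axis.
deformed = trader_time([(0, 1.2000), (1, 1.2001), (2, 1.2100), (3, 1.2101)])
```

Events separated by equal clock intervals thus end up far apart in trader time when the market is active, and close together when it is quiet.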

A second key discovery was that the pattern of forex prices measured at one scale, say 10 minutes, is similar to the pattern when measured at other scales, say hourly or daily. This scaling law is a characteristic of fractals, and it was Benoit Mandelbrot, the father of fractal science, who first suggested the financial markets might have fractal properties when he examined cotton prices in the early 1960s.
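The scaling law can be checked numerically: if the mean absolute move over an interval dt grows like dt**H, then H falls out of a least-squares fit in log-log coordinates. The sketch below estimates H for a synthetic random walk, where theory says it should come out close to 0.5; the function and interval choices are illustrative assumptions, not any published methodology:

```python
import math
import random

def scaling_exponent(prices, intervals):
    """Estimate H in E[|x(t+dt) - x(t)|] ~ dt**H by least squares on
    log-log coordinates. `prices` is a uniformly sampled series of
    log-prices; `intervals` are lag lengths in ticks. Illustrative only."""
    xs, ys = [], []
    for dt in intervals:
        moves = [abs(prices[i + dt] - prices[i]) for i in range(len(prices) - dt)]
        xs.append(math.log(dt))
        ys.append(math.log(sum(moves) / len(moves)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

# A pure Gaussian random walk should give an exponent near 0.5.
random.seed(1)
walk, x = [], 0.0
for _ in range(50_000):
    x += random.gauss(0, 1)
    walk.append(x)

H = scaling_exponent(walk, [1, 2, 4, 8, 16, 32])
print(round(H, 1))  # close to 0.5 for a random walk
```

A persistent departure of the fitted exponent from 0.5 in real tick data is one of the signatures the high-frequency literature looks for.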

Ramazan Gençay, professor of economics at Simon Fraser University, Vancouver, and co-author, with Olsen, Michel Dacorogna, Ulrich Müller and Olivier Pictet, of An Introduction to High-Frequency Finance, says one of the most important things the study of high-frequency data reveals is the heterogeneity of market participants. "Each tick that you observe can correspond with a different class of trader with different characteristics," he says.

One tick might be from a market-maker, the next from a small hedge fund trading algorithmically, the next from a corporate trading infrequently and the next from an institutional trader working on a large order. Only at the tick level are the dynamics of the interaction between these different groups visible. Occasionally, the views of these different classes of traders suddenly coalesce and everyone acts in a similar way for a short period. "Most of the time, there is quite a bit of spread across how people see things in the market, then something kicks in and they start thinking similarly," says Gençay.

Although these discoveries, and the availability of high-frequency data, have spawned a hive of university research, they are of more than academic interest. Traders can profit from identifying when periods of homogeneous thinking kick in - periods that might last for only minutes or seconds. "The trading time horizon might be only five to 10 seconds, but a trader can make a reasonable return in five to 10 seconds," says Gençay.

Meanwhile, trader time and the scaling law are essential to establishing the credibility of individual prices at the tick level. With the benefit of hindsight, it can be relatively easy to judge whether an individual quote was a tradable price or not, says Rakhal Dave, chief executive of Olsen Financial Technologies. However, an algorithmic trading system aimed at fleeting inefficiencies in the market has no such luxury. It must make a judgement call as the price flashes through the network. Knowing whether or not the price falls during a period where higher market volatility is expected, and whether it is within a threshold that the scaling law would suggest, can increase the accuracy of a system's snap judgement.
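A minimal version of such a snap judgement might combine the two ideas: scale the typical hour-of-day volatility down to the tick's time spacing via the scaling law, and reject moves far outside that envelope. Everything below, from the function name to the four-sigma threshold, is an assumption for illustration, not Olsen's actual filter:

```python
import math

def plausible_tick(new_price, last_price, dt_seconds, hourly_vol, hour,
                   h_exponent=0.5, n_sigmas=4.0):
    """Crude plausibility gate for an incoming quote.
    `hourly_vol[hour]` is the typical absolute log-return over one hour
    at that time of day (the seasonality profile); the scaling law shrinks
    it to this tick's spacing. All names and thresholds are hypothetical."""
    expected_move = hourly_vol[hour] * (dt_seconds / 3600.0) ** h_exponent
    actual_move = abs(math.log(new_price / last_price))
    return actual_move <= n_sigmas * expected_move

# A 5-pip EUR/USD move in two seconds during the quiet Tokyo lunch hour
# looks suspect; the same move over ten minutes does not.
quiet = {3: 0.0005}
print(plausible_tick(1.2005, 1.2000, 2, quiet, 3))    # False (suspect)
print(plausible_tick(1.2005, 1.2000, 600, quiet, 3))  # True (plausible)
```

The point is only that both the time-of-day profile and the scaling exponent feed into the same yes/no decision that a live system must make in microseconds.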

Algorithmic, or even human, traders dealing at high frequency need such techniques to judge prices because market fundamental knowledge is of little help at the tick level. "The dynamics of high-frequency data are almost purely internal, driven by their own feedback mechanisms," says Citi's James. "If you look at annual data, you find it connected to underlying economics, and if you look at medium-term data - weekly or daily - you will find drivers in things like carry trades, interest rates and equities. But tick-by-tick moves simply don't care about these things - they just care about what was happening the minute before and how everyone reacted to it."

Although much of the research into high-frequency finance has been conducted on forex data, the findings often apply more widely, adds James: "The defining characteristics of high-frequency data hold for all markets, and you can often transfer models between asset classes with more facility than you could with models based on less-frequent data."

Citi licenses some of Olsen's data, which now includes futures, options, interest rates, swaps and equity indexes, although James's group concentrates its research on frequencies slightly lower than tick level. Nevertheless, it has found evidence of volume flows and spikes around events such as economic data releases (for instance, non-farm payroll figures and interest rate announcements) and option expiries. "When options expire in different time zones and in different currencies, there are big spikes in volumes," says James. "When lunchtime occurs, the markets know all about it. When there are data releases for a particular currency pair, you get big spikes - as much as four times normal trading in the five minutes before and the five minutes after a data release. The ebb and flow through the day is quite astonishing." Citi makes some of its analysis available to its clients and uses it to help in the timing of trades.

While the discoveries of the study of high-frequency data are slowly filtering through into the wider world, a number of hedge funds have picked up this information and run with it over the years. But this is a secretive world, and few funds are prepared to speak openly about what they do.

In 1995, fixed-income and equity options trading specialist Ephraim Gildor set up a hedge fund called Arbitrade to trade options based only on high-frequency data analysis. The company was extremely successful, and Gildor sold it to New Jersey-based broker-dealer Knight in 2000 for more than $400 million. Gildor now runs New York-based hedge fund Axiom Investment, which has so far used high-frequency data for its risk models, although it is beginning to use it for trading predictions as well. Gildor believes the move to high-frequency trading is simply another instance of the scaling law - people are using tick data in the same sorts of ways they once used daily or monthly data. "It is just that the resolution with which traders view the market is getting higher," he says.

Olsen's own hedge fund, which is based entirely on high-frequency analysis and trading, has $125 million under management. It trades spot forex only in a completely automated straight-through processing system that makes around 3,000 trades a day. The fund is up by around 5% so far this year. "But the remarkable thing is its stability - it achieves a very stable risk-return profile," says Olsen.

However, running such a trading set-up is no easy task. First, the sheer volume of tick data to be collected and stored is daunting. For the euro/dollar currency pair alone, Olsen collected more than 3 million ticks in May, while the E-mini S&P 500 futures market at the Chicago Mercantile Exchange produces around 1 million ticks a day, says Dave.

Then there is the cleanliness of the data. At high frequencies, price errors are inevitable and treacherous. "The danger of data that is not clean is that it can mislead you big time," says Gençay.

Olsen agrees: "The more sophisticated your application, the more sensitive you become to the quality of the data." Companies such as Olsen and TraderMade International, based in Kent, UK, go to great lengths to identify rogue prices and fill gaps rationally with the data they provide.
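A deliberately naive flavour of such filtering compares each quote with a trailing median and flags outsized jumps. Real vendor filters are adaptive and far more careful; the window and threshold below are arbitrary choices for illustration:

```python
def flag_rogue_ticks(prices, window=5, max_rel_jump=0.01):
    """Flag indices of prices that jump away from a trailing median.
    A naive stand-in for the adaptive filters data vendors actually run;
    the window length and jump threshold are arbitrary assumptions."""
    flagged = []
    for i, p in enumerate(prices):
        recent = sorted(prices[max(0, i - window):i]) or [p]
        median = recent[len(recent) // 2]
        if abs(p - median) / median > max_rel_jump:
            flagged.append(i)
    return flagged

quotes = [1.2000, 1.2001, 1.2002, 1.3500, 1.2001, 1.2003]  # 1.3500 is a fat finger
print(flag_rogue_ticks(quotes))  # [3]
```

Because the reference is a median rather than the previous tick, a single rogue print is flagged without poisoning the judgement of the quotes that follow it.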

Then there is the challenge of constructing a system that can handle the data volumes and the speed at which it must be analysed, decisions taken and orders routed to exchanges. Tyler Moeller came up against these problems while working with the automated trading group at Chicago-based hedge fund Ronin Capital. Moeller realised his company needed a more efficient way of integrating the various high-frequency data sources and analytic and trading systems that it used. A particular difficulty was that in order to reduce latency, processes should be carried out at the lowest level possible within the system, preferably in machine code directly on the hardware. But in dynamic markets where opportunities come and go, traders need the flexibility to amend and extend the system themselves.

"Generally, the more high-performance you try to make a system, the less abstract and easy it is to build on, and vice versa," says Moeller. His answer was to create an abstraction layer between the low-level operations and the trader, whereby disparate data sources appear in an identical form, and a standard input format allows the trader to place orders across disparate markets from a single screen. Underneath, fine-tuned technology using data compression, normalisation and other techniques translates and moves data and orders very fast.
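The essence of such an abstraction layer is a single normalised tick type that every feed handler translates into, so strategy code never touches vendor-specific formats. The field names and the imaginary "vendor A" wire format below are hypothetical, not Broadway's actual design:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class NormalizedTick:
    """Common shape every feed is translated into; strategy code sees
    only this type. All field names here are hypothetical."""
    venue: str
    symbol: str
    bid: float
    ask: float
    ts: datetime

def from_vendor_a(raw: dict) -> NormalizedTick:
    # Imaginary vendor A sends epoch milliseconds and string prices.
    return NormalizedTick(
        venue="A",
        symbol=raw["pair"],
        bid=float(raw["b"]),
        ask=float(raw["a"]),
        ts=datetime.fromtimestamp(raw["ts_ms"] / 1000, tz=timezone.utc),
    )

tick = from_vendor_a({"pair": "EURUSD", "b": "1.2000",
                      "a": "1.2002", "ts_ms": 1178000000000})
print(tick.symbol, round(tick.ask - tick.bid, 4))  # EURUSD 0.0002
```

One adapter per feed keeps the translation cost at the edge of the system, while everything downstream, from analytics to order placement, works against the one normalised type.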

Moeller has now commercialised the technology, creating New York-based Broadway Technology, which counts banks, hedge funds and broker-dealers among its customers, including BNP Paribas, Ronin and Cantor Fitzgerald.

The trend across markets is to move to higher frequencies, and all participants are facing challenges in terms of their analysis and response to prices, events and opportunities. "The processing speed race is leading to an increasing requirement for a quantitative and technological approach to trading," says Paul-Henry Bacher, head of electronic trading at Credit Suisse in London. "This has also unlocked a number of new opportunities, as the capability to collect and analyse historical tick and order book data can lead to more accurate identification of significant patterns in the market data that can be exploited. This is where high-frequency data and its analysis are proving very valuable today, when combined with the fastest trading technology, in a wide range of fields including optimal order execution, competitive market-making and statistical arbitrage."

But what is the impact of all this high-frequency activity on the markets as a whole? Olsen believes it is benign, bringing greater stability. "The core function of a market is to provide liquidity; we provide liquidity," he says. "When the market overshoots in any one direction, we take the other side - we're a stabiliser."

Jean-Philippe Bouchaud, chairman and chief scientist at Paris-based hedge fund Capital Fund Management, who publishes a steady stream of papers on high-frequency data analysis, agrees. "The frontier between high-frequency strategies and market-making is very slim," he says. "High-frequency strategies often act as liquidity providers to markets and, hence, should help with quenching volatility at lower frequencies. I believe that the recent marked downward trend in the volatility is due to the proliferation of high-frequency players."
