# Giving the Omega ratio a new lease of life

## Johnson-Omega could change the way financial firms measure portfolio performance

At the time of its introduction in 2003, the Omega ratio was expected to replace the popular, but flawed, Sharpe ratio as the standard for measuring portfolio performance at financial firms. Riddled with practical challenges, however, the measure didn't stick.

Could a tweak in the distribution used by the ratio bring it back to prominence? Alexander Passow, a research fellow at the University of Darmstadt and founder of advisory firm JP Omega, thinks so.

The Omega ratio, defined as a probability-weighted ratio of gains versus losses given a certain threshold, arrived on the scene over a decade ago in a bid to help the industry include as much information as possible in models. At the time, most models assumed normal or lognormal distributions and restricted modelling to a few 'moments' – the measures that describe a distribution. In contrast, the Omega ratio uses an empirical distribution, which is built using all the data points available, without assuming any distribution characteristics.
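In code, the empirical Omega ratio at a given threshold reduces to the average gain above the threshold divided by the average loss below it. A minimal numpy sketch (the function and variable names here are illustrative, not from the original paper):

```python
import numpy as np

def empirical_omega(returns, threshold=0.0):
    """Empirical Omega ratio: expected gains above the threshold
    divided by expected losses below it, using every data point
    with no distributional assumptions."""
    r = np.asarray(returns, dtype=float)
    gains = np.maximum(r - threshold, 0.0).mean()
    losses = np.maximum(threshold - r, 0.0).mean()
    return gains / losses

# a return series with more upside than downside scores above 1
rng = np.random.default_rng(0)
sample = rng.normal(0.01, 0.05, size=1000)
print(empirical_omega(sample, threshold=0.0))
```

Because the ratio is built from the full sample, every feature of the data, including the higher moments, feeds into it, which is exactly the property discussed above.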

While this should have been a welcome change, it didn't succeed in replacing the simpler and more popular Sharpe ratio.

One of the reasons for this is that finding an 'optimal portfolio' using the Omega ratio is a tough task, because it means optimising all the moments in the empirical distribution. That introduces a great deal of noise.

"If you optimise with respect to a hundred data points, with the empirical Omega function, you end up optimising moments higher than five and up to a hundred," says Passow. "Optimising moments where the order approaches the number of data points is misleading, because you are optimising with respect to noise."

Portfolio optimisation means a careful balancing act when it comes to the number of factors used in the model. Consider too few, and you might miss out on important information. Prior to the 2008 financial crisis, for instance, the likelihood of turmoil in the market was much better reflected by the skewness – a measure of the asymmetry of the distribution – and kurtosis – or the fatness of the tail – than by volatility and correlation between assets.

But use too many factors and optimisation becomes difficult and computationally expensive. Furthermore, using too many factors can introduce noise and errors. Typically, firms find the first four moments – mean, variance, skewness and kurtosis – to be sufficient for capturing the relevant information. Anything more is considered to be overkill.

A second problem with the Omega ratio in its original form is that the empirical distribution is not a smooth function: because it is built from all the data points without any distributional assumptions, its cumulative distribution function has plateaus and does not strictly increase. That makes optimal portfolios hard to find, because optimisation algorithms tend to get stuck on those plateaus rather than progressing towards a solution.

Those challenges mean the industry missed a chance to replace the Sharpe ratio – a measure that, despite its prolific use among hedge funds, is often criticised for being useful mostly in the case of more liquid and diversified portfolios.

In his paper, *Johnson-Omega performance measure*, Passow offers a response. The crux is to change the distribution used in the Omega function from the empirical to the Johnson distribution.

The Johnson distribution is defined by four parameters, which can be calibrated to match the first four moments: mean, variance, skewness and kurtosis. Optimising a portfolio based on those four moments does not introduce any noise into the exercise, since higher moments are endogenously accounted for by the four parameters of the distribution.

"If you have a hundred data points, you only consider the four moments which can capture crucial characteristic features of the asset classes," explains Passow. "For example, mortgage-backed securities have significant negative skewness, extremely fat tails and very low volatility."

Significantly, other portfolio managers who have used the measure like it. "The Johnson distribution is a good one to use," says Peter Urbani, a senior portfolio and risk manager at Sanlam Investments in Cape Town.

A convenient feature is that the Johnson-Omega can be used just like the Sharpe ratio. The Sharpe ratio is defined as the excess return of an asset divided by risk or volatility. With Johnson-Omega, the formula incorporates skewness and kurtosis and can be optimised with a closed-form solution.
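The closed-form solution from the paper is not reproduced here, but the quantity it computes can be sketched from the standard integral definition of Omega: the area under the survival function above the threshold, divided by the area under the CDF below it. A numerical stand-in using scipy (the Johnson SU parameters below are hypothetical, not taken from the paper):

```python
import numpy as np
from scipy import stats, integrate

def omega_from_dist(dist, threshold):
    """Omega ratio of a parametric distribution: area under the
    survival function above the threshold over area under the CDF
    below it. Numerical integration stands in for the paper's
    closed-form solution."""
    gains, _ = integrate.quad(dist.sf, threshold, np.inf)
    losses, _ = integrate.quad(dist.cdf, -np.inf, threshold)
    return gains / losses

# illustrative negatively skewed Johnson SU return distribution
dist = stats.johnsonsu(a=-0.5, b=1.5, loc=0.0, scale=0.05)
print(omega_from_dist(dist, threshold=0.0))
```

Because the distribution is smooth, the plateaus of the empirical version disappear, which is what makes the optimisation tractable.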

It's not that financial firms love simple models. In many cases, implementation challenges prevent them from using those that are more complex – even if they are arguably more accurate. In recent times, quantitative finance has seen a shift away from sexy topics such as exotics pricing and towards nuts-and-bolts issues such as computational efficiency, coding and what sort of distribution to use.

Now that the shift has made the Omega function easier to tame, perhaps it might finally replace the Sharpe ratio, after all.

