As the world's largest banks prepare to begin posting initial margin against non-cleared trades from September – Europe's postponement notwithstanding – many are fretting about the attendant rise in the cost of funding the collateral required to back those positions. This calls for an adjustment to derivatives prices reflecting the cost of funding the initial margin on a trade, known as a margin valuation adjustment (MVA).
Some large dealers already partially price in MVA on a handful of cleared trades where the cost is material. But once the bilateral margin rules laid out by the Basel Committee on Banking Supervision and the International Organization of Securities Commissions come into full effect globally, many argue that to continue largely ignoring MVA would be unwise. Under the US rules alone, the Federal Reserve estimates the annual cost of funding the required margin at $2.5 billion.
Naturally, this raises the question of how MVA can be estimated both quickly and accurately. This is where things start to get complicated.
Initial margin is calculated to cover potential future exposures – generally using a value-at-risk or expected shortfall model – over a 10-day period at a 99% confidence level. MVA in turn is calculated by taking an expectation of initial margin over the lifetime of the trade and applying a funding spread to it.
Calculating both requires Monte Carlo simulations, one nested within another – a method cumbersome by its very description, and dreaded by quants.
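The brute-force approach described above can be sketched in a few lines. This is a minimal illustration, not any bank's production methodology: a single delta-one position on a lognormal underlying, with all parameter values (spot, volatility, funding spread, path counts) chosen purely for illustration. The outer loop simulates the underlying; at each node an inner simulation estimates initial margin as the 99% VaR of the 10-day P&L, and the funding spread is accrued against the discounted expected margin.

```python
# Hedged sketch: nested Monte Carlo MVA for a single unit position on a
# lognormal underlying. All names and numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

S0, sigma, r = 100.0, 0.2, 0.01      # spot, volatility, risk-free rate
T, n_steps = 1.0, 12                 # trade maturity, monthly time grid
dt = T / n_steps
horizon = 10 / 250                   # 10-day margin period of risk
funding_spread = 0.005               # assumed 50bp spread on funding the IM

n_outer, n_inner = 200, 1000         # outer paths, inner paths per node

def im_at(S_t):
    """Initial margin at one node: 99% VaR of the 10-day P&L of a unit
    delta-one position, estimated by an inner simulation."""
    pnl = S_t * (np.exp((r - 0.5 * sigma**2) * horizon
                        + sigma * np.sqrt(horizon)
                        * rng.standard_normal(n_inner)) - 1.0)
    return -np.quantile(pnl, 0.01)   # loss at the 99% confidence level

# Outer simulation, accumulating the discounted expected IM times the spread
mva = 0.0
S = np.full(n_outer, S0)
for k in range(1, n_steps + 1):
    S *= np.exp((r - 0.5 * sigma**2) * dt
                + sigma * np.sqrt(dt) * rng.standard_normal(n_outer))
    expected_im = np.mean([im_at(s) for s in S])
    mva += funding_spread * expected_im * np.exp(-r * k * dt) * dt

print(f"MVA estimate: {mva:.4f}")
```

Even for this toy single-asset trade, every outer node triggers a full inner simulation – n_outer × n_steps × n_inner revaluations – which is exactly the cost that scales so badly for real portfolios.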
In May 2015, Chris Kenyon of Lloyds Banking Group, and Andrew Green, his then-colleague, now at Scotiabank, came up with a compression technique that significantly pares the time required to perform portfolio revaluations at each time point – and thus the time to calculate MVA.
However, given the pressing need to calculate and price the adjustment amid an onslaught of other business costs, some argue even quicker methods are needed – especially since MVA could become an important consideration in trading activity.
"Two layers of simulation take too much time, so people have to take shortcuts. In the end, you are not getting where you want to go, and you are getting an approximation anyway," says Wujiang Lou, a director in global fixed-income trading at HSBC in New York.
In this month's first technical, MVA transfer pricing, Lou proposes an approximation technique that could speed up pricing of the adjustment by approximating initial margin requirements using the delta sensitivity – the change in a derivative's value versus a change in the underlying – rather than going for a full-blown simulation of initial margin.
The delta approximation of initial margin can in turn be calibrated to real margin payments posted in the market – such as those based on clearing house methodologies, the standard initial margin model (Simm) or vendor models – using a multiplier that scales the estimated margin to appropriate market levels.
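One simple way to realise a delta-based margin proxy – offered here as an illustrative sketch, not as Lou's actual formulation – is a delta-normal 99% VaR over the 10-day horizon, rescaled by a multiplier calibrated to a single observed margin figure. The function name, the observed margin number and all parameters below are assumptions for illustration.

```python
# Hedged sketch: delta-based IM proxy calibrated to an observed margin
# figure (e.g. one produced by a clearing house model or a Simm run).
from math import sqrt
from statistics import NormalDist

Z99 = NormalDist().inv_cdf(0.99)     # 99% normal quantile, ~2.326

def delta_im(delta, spot, vol, horizon_days=10, multiplier=1.0):
    """Approximate IM as multiplier * |delta| * spot * vol * sqrt(h) * z99,
    i.e. a delta-normal VaR over the margin period of risk."""
    return multiplier * abs(delta) * spot * vol * sqrt(horizon_days / 250) * Z99

# Calibrate the multiplier so the proxy reproduces one observed margin
observed_margin = 5.1                # assumed margin actually posted
raw = delta_im(delta=0.6, spot=100.0, vol=0.25)
multiplier = observed_margin / raw

print(f"calibrated multiplier: {multiplier:.3f}")
```

Once calibrated, the proxy turns an initial margin lookup into a cheap function of the trade's delta, so it can be evaluated at every node of a simulation grid without any inner Monte Carlo.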
Lou's method keeps the initial margin calculation and derivatives pricing processes separate, and provides a partial differential equation (PDE) for the pricing part. The existence of a PDE makes the MVA pricing process very quick.
"For pre-trade analytics and market-making, it is important to be able to calculate a reasonable approximation to MVA at the same time as pricing the trade, in real time. Using the PDE-based approximation proposed by the author, real-time calculation of MVA with live market data will be much easier to implement than the traditional Monte Carlo approach," says Alexander Sokol, New York-based chief executive of solutions vendor CompatibL.
The advantage of having a separate PDE to price is that most firms already know how to solve it, says Lou: "The derivative is priced using the PDE, so you can use finite differences to solve it – most banks have those kinds of models already."
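To make the PDE route concrete, the sketch below prices a European call on an explicit finite-difference grid twice – once with and once without a funding-cost source term driven by a delta-based IM proxy – and reads MVA off as the price difference. This is a toy illustration under stated assumptions (Black-Scholes dynamics, delta-normal margin proxy, illustrative parameters), not the specific PDE Lou derives in the paper.

```python
# Hedged sketch: MVA as a price difference between two finite-difference
# solves of a Black-Scholes-type PDE, the second carrying a funding-cost
# term proportional to a delta-based IM proxy. All numbers are illustrative.
import numpy as np

K, sigma, r, T = 100.0, 0.2, 0.01, 1.0
s_fund = 0.005                       # assumed funding spread on the IM
z99, h = 2.326, 10 / 250             # 99% quantile, 10-day horizon

n_s, n_t = 200, 4000                 # explicit scheme needs a fine time grid
S = np.linspace(0.0, 4 * K, n_s)
dS, dt = S[1] - S[0], T / n_t

def solve(spread):
    """Step the payoff backward in time; 'spread' drags the value by the
    funding cost of the delta-proxy initial margin."""
    V = np.maximum(S - K, 0.0)       # terminal call payoff
    for n in range(1, n_t + 1):
        delta = np.gradient(V, dS)
        gamma = np.gradient(delta, dS)
        im = np.abs(delta) * S * sigma * np.sqrt(h) * z99
        V = V + dt * (0.5 * sigma**2 * S**2 * gamma
                      + r * S * delta - r * V - spread * im)
        V[0] = 0.0                               # crude boundary conditions
        V[-1] = S[-1] - K * np.exp(-r * n * dt)
    return V

mva_profile = solve(0.0) - solve(s_fund)   # MVA across the spot grid
i0 = np.searchsorted(S, K)                  # node nearest the strike
print(f"approximate MVA near S=K: {mva_profile[i0]:.4f}")
```

Because the grid solve replaces both layers of simulation, a pass like this runs in milliseconds – which is what makes a PDE formulation attractive for pre-trade, real-time use.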
The looming onset of the non-cleared margin rules makes the development of quick approximations of MVA such as the one provided by Lou even more significant; banks, even smaller regional players, will want something they can calculate fairly quickly as trades come through.
As newer XVAs get added to the mix, the computational challenges and associated costs are becoming harder to manage, especially for smaller players. Dealers have also started XVA optimisation drives in recent times, looking at all of the XVA effects of a trade before taking it on so that the business can stay competitive by reducing these costs.
All of these issues call for a massive ramp-up of efforts in the XVA space. Are banks prepared for the next wave?
Also out this month: Deriving derivatives, by Andrei Soklakov