Banks make new push on FRTB’s P&L test
Industry calls for series of changes as regulators prepare new consultation, says Nomura’s Epperlein
Banks have renewed calls for changes to the new approval regime for market risk capital models, in an attempt to make one of its key tests easier to pass, and to reduce the impact of failure.
Some of the industry’s proposals were described by Eduardo Epperlein, global head of risk methodology at Nomura, speaking at the Quant Summit Europe conference in London yesterday (March 7). Without changes, Epperlein warned, banks may abandon the use of internal models altogether and fall back on the cruder, standardised approach to capital instead.
“I believe we are running a real danger that by making this so cumbersome, so difficult to pass, the industry may just throw in the towel and we will be left to manage our regulatory capital requirements through some sort of standardised approach,” said Epperlein, who noted he was offering personal opinions.
According to Epperlein, industry suggestions include: removing data inconsistencies that could cause a model to fail; changing the methodology of the test itself; implementing a so-called ‘amber’ zone for desks that fail, rather than dropping them directly on to the standardised approach; and applying a temporary calibration period once the rules go live.
The International Swaps and Derivatives Association is co-ordinating industry lobby efforts on these four areas.
Banks’ criticisms of the regime are not new – and nor are some of the fixes – but the key regulatory working group at the Basel Committee on Banking Supervision has been under new leadership for the past year, and is understood to be working on a set of proposals that would overhaul the regime. Risk.net has previously reported that proposals are expected this month, or in April.
In the meantime, the committee has announced a three-year delay in implementation of the rules, which were originally scheduled for January 2019.
The committee’s Fundamental Review of the Trading Book (FRTB) requires each of a bank’s trading desks to individually pass a so-called profit and loss attribution test in order to use the internal models approach (IMA). The fallback in case of failure is the standardised approach, under which capital can increase dramatically, according to industry studies. For foreign exchange books, for instance, standardised capital is 5.3 times higher.
The P&L attribution test requires two different measures of P&L to be compared – the hypothetical P&L generated by front-office pricing models, and the risk-theoretical P&L generated by the bank’s risk models.
The measures are compared in two different ways. One looks at the gap between the two using a mean ratio, and the other looks at the variance of that gap using a variance ratio. Both ratios should fall within certain established thresholds – if not, a breach is counted. Four breaches within any 12-month period will force a desk on to the standardised approach.
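The test mechanics described above can be sketched in code. This is a minimal illustration, not the regulatory specification: the ±10% mean-ratio bounds and 20% variance-ratio limit used here are illustrative placeholders, and the breach-counting helper assumes one observation per month.

```python
import numpy as np

def pl_attribution_breach(hypo_pl, risk_pl,
                          mean_bounds=(-0.10, 0.10),  # illustrative thresholds
                          var_limit=0.20):
    """Return True if a desk breaches the P&L attribution test for one
    reporting period (thresholds here are illustrative, not official)."""
    hypo_pl = np.asarray(hypo_pl, dtype=float)
    unexplained = hypo_pl - np.asarray(risk_pl, dtype=float)

    # Mean ratio: mean of unexplained P&L over std dev of hypothetical P&L
    mean_ratio = unexplained.mean() / hypo_pl.std(ddof=1)
    # Variance ratio: variance of unexplained over variance of hypothetical
    var_ratio = unexplained.var(ddof=1) / hypo_pl.var(ddof=1)

    lo, hi = mean_bounds
    return not (lo <= mean_ratio <= hi and var_ratio <= var_limit)

def desk_fails(monthly_breaches, max_breaches=4):
    """Four breaches within any rolling 12-month window force the desk
    on to the standardised approach."""
    return sum(monthly_breaches[-12:]) >= max_breaches
```

When the two P&L series match exactly, both ratios are zero and the desk passes; any material mismatch shows up first in the variance ratio.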
The test has attracted controversy since its introduction, with the rules initially describing two conflicting versions – a strong one and a weaker one. The strong version requires a bank to use its risk models to calculate its risk-theoretical P&L, while the weak version requires it to use the front-office model, restricted to the more limited set of risk factors that exist in the risk models.
One problem is that the two models will typically use different sets of data, meaning a desk could fail simply because of disparities in the underlying datasets. In his presentation, Nomura’s Epperlein argued banks should be allowed to feed the same set of data into both models.
“[We] still haven’t got confirmation from regulators that we can do that. If we can’t do that, then we know we are going to fail this test. Right now… we know that, at the most, only a handful of desks will pass out of 50,” said Epperlein.
Another major issue is the choice of the test metrics themselves, said Epperlein. The denominators used in the mean ratio and the variance ratio are the standard deviation and variance of the hypothetical P&L respectively. These can be very low for a well-hedged desk because P&L volatility will be low.
“Say you have a well-hedged desk… you will drive the denominator down dramatically. However, in the top, if you have a little bit of a mismatch, then you will easily blow up the ratio,” said Epperlein.
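The effect Epperlein describes can be shown numerically. In the sketch below (hypothetical numbers), two desks carry an identical model mismatch, but the well-hedged desk’s tiny hypothetical P&L volatility shrinks the denominator and inflates the variance ratio:

```python
import numpy as np

rng = np.random.default_rng(0)
# Identical model mismatch for both desks (the "little bit of a mismatch")
mismatch = rng.normal(0.0, 0.5, size=250)

results = {}
for label, hypo_vol in [("unhedged", 10.0), ("well-hedged", 0.2)]:
    hypo = rng.normal(0.0, hypo_vol, size=250)  # hypothetical P&L
    risk = hypo + mismatch                      # risk-theoretical P&L
    unexplained = hypo - risk
    mean_ratio = unexplained.mean() / hypo.std(ddof=1)
    var_ratio = unexplained.var(ddof=1) / hypo.var(ddof=1)
    results[label] = var_ratio
    print(f"{label} desk: mean ratio {mean_ratio:+.3f}, "
          f"variance ratio {var_ratio:.3f}")
```

The unhedged desk’s variance ratio is negligible, while the well-hedged desk’s blows through any plausible threshold – even though the absolute mismatch is the same.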
Industry proposals to fix this include using a better statistical measure, such as the Kolmogorov-Smirnov test, which looks at the distance between the two distributions, or rank correlations between the two series, which are less sensitive to outliers.
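A minimal sketch of how those two alternative metrics could be computed, using SciPy; any pass thresholds applied to them would be a calibration choice and are not shown here:

```python
import numpy as np
from scipy.stats import ks_2samp, spearmanr

def alternative_metrics(hypo_pl, risk_pl):
    """Kolmogorov-Smirnov distance between the two P&L distributions,
    and Spearman rank correlation between the two series (rank-based,
    so less sensitive to outliers than a variance ratio)."""
    hypo_pl = np.asarray(hypo_pl, dtype=float)
    risk_pl = np.asarray(risk_pl, dtype=float)
    ks_stat, _ = ks_2samp(hypo_pl, risk_pl)  # distance between distributions
    rho, _ = spearmanr(hypo_pl, risk_pl)     # rank correlation
    return ks_stat, rho
```

For identical series the KS distance is zero and the rank correlation is one; a growing model mismatch raises the first and lowers the second, without a well-hedged desk’s low volatility distorting either.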
Other key improvements, according to Epperlein, include a so-called amber zone for desks that have failed the test – the idea being that the capital penalty would be reduced to avoid the cliff effect of failure. A post-implementation grace period could also help set appropriate pass thresholds for the test before the rules become binding.
“The idea is that when we do go live, maybe we could be allowed a test period to make sure it works. We are not saying the test is a bad test. It’s a good test, but probably costly to implement and difficult to pass… the cost [is not commensurate] with the benefits,” added Epperlein.
Copyright Infopro Digital Limited. All rights reserved.