IRB external data relief gains support
FRTB threatens popular hedge fund strategies
Esma to publish commodity derivatives market size data
COMMENTARY: Data day
The importance of decent data when attempting to make rational decisions was made triply apparent this week across banking, commodities and quantitative trading.
Along with other European regulators, the UK’s Prudential Regulation Authority (PRA) has started to endorse the idea of using external data to build internal ratings-based (IRB) models for credit risk-weighted assets (RWAs).
Following Basel II, banks divided into those taking the IRB approach and those using the standardised approach (SA) for modelling credit RWAs. But a speech by Martin Stewart of the PRA called for the uneven playing field between IRB and SA banks to be levelled – partly by dealing with what he called the “data conundrum”.
To qualify for regulatory capital relief, a bank has to prove it has been using IRB approaches internally for three years. But getting enough data to start with can be a challenge, as some portfolios don’t have much default data, and this is where external data can be helpful.
The European Securities and Markets Authority (Esma), meanwhile, plans to publish much-awaited data on the total size of commodity derivatives markets by July. The data is required to determine whether commodity firms will be in scope of the second Markets in Financial Instruments Directive when it becomes law next year.
To qualify for an exemption, firms must – as one part of the test – calculate their own activity as a percentage of the overall market, which requires knowing the total size of the exchange-traded and over-the-counter commodity derivatives market. Currently, such aggregated data does not exist, and Esma has already warned that the figures it publishes in July will be incomplete.
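The mechanics of the market-share leg of the test can be sketched as below. The function names, the threshold and all the figures are purely illustrative assumptions – the actual thresholds vary by asset class and the aggregate market figures are exactly the data Esma has yet to publish.

```python
# Illustrative sketch of the market-share leg of the exemption test.
# Threshold and notional figures are hypothetical, not regulatory values.

def market_share(firm_notional: float, total_market_notional: float) -> float:
    """Firm's commodity derivatives activity as a fraction of the overall market."""
    return firm_notional / total_market_notional

def passes_exemption_test(firm_notional: float,
                          total_market_notional: float,
                          threshold: float) -> bool:
    """True if the firm's share falls below the (assumed) exemption threshold."""
    return market_share(firm_notional, total_market_notional) < threshold

# Hypothetical numbers: EUR 2bn of activity in a EUR 500bn market,
# tested against an assumed 4% threshold.
share = market_share(2e9, 500e9)
print(f"share = {share:.2%}, exempt = {passes_exemption_test(2e9, 500e9, 0.04)}")
```

The point of the incomplete-data warning is visible in the denominator: without a reliable figure for `total_market_notional`, the share – and therefore the exemption decision – cannot be computed with confidence.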
Data scientists at Morgan Stanley, meanwhile, are scratching their heads after observing a mysterious modal pattern in market data, which they attribute to systematic trading activity.
The distribution of time intervals between trades in Sony stock did not decay exponentially as intervals lengthened, as would be expected if trades arrived at random. Morgan Stanley’s quants believe this reflects the presence of systematic and algorithmic traders in the market.
But the challenge for researchers is multiplied by the fact that trading algorithms are constantly adapting in response to changes in the market microstructure and clients’ specific execution needs. The growth of quantitative trading also forces firms to constantly refresh their algorithms – making it more difficult for researchers to identify patterns in the data.
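The expectation the quants start from can be illustrated with a short sketch. Under a simple Poisson model of random trade arrivals, inter-trade durations are exponentially distributed, so a histogram of them decays monotonically; a secondary peak at some interval would instead point to clock-aligned, systematic activity. The helper names below are hypothetical, and the simulated timestamps stand in for real trade data.

```python
import random

def inter_trade_intervals(timestamps):
    """Durations between consecutive trades (timestamps assumed sorted)."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def interval_histogram(intervals, bin_width, n_bins):
    """Simple fixed-width histogram of inter-trade durations."""
    counts = [0] * n_bins
    for dt in intervals:
        i = int(dt // bin_width)
        if i < n_bins:
            counts[i] += 1
    return counts

# Simulate purely random (Poisson) arrivals: intervals are exponential,
# so the histogram should fall away monotonically bin by bin. A bump at
# some interval in real data would suggest systematic trading instead.
random.seed(0)
t, timestamps = 0.0, []
for _ in range(10_000):
    t += random.expovariate(1.0)  # mean inter-trade time of 1 second
    timestamps.append(t)

counts = interval_histogram(inter_trade_intervals(timestamps), 0.5, 8)
print(counts)  # roughly geometric decay from bin to bin
```

The researchers’ difficulty is that real order flow is not stationary: as algorithms adapt, any such histogram is a moving target, and a pattern detected in one sample may have vanished by the next.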
STAT OF THE WEEK
Had the Basel Committee’s cap on ‘substitutability’ – one of five categories in its systemic bank assessment methodology – been removed, JP Morgan would have been pushed into the top bucket in both 2015 and 2016, raising its capital surcharge sharply from 2.5% to 3.5%.
QUOTE OF THE WEEK
“We are not suggesting we will run the netting service at a lesser standard than we would run anything else, but the service will not be [deemed] systemically important, so is more suited to the use of distributed ledger technology at this stage. The product testing is very exhaustive and we are setting high requirements for it to reach before we move into production” – Tom Zschach, chief information officer at CLS.