Thomson Reuters calls for a liquidity data group to be set up by the Basel Committee on Banking Supervision
LONDON – “Liquidity risk should be considered the ultimate operational risk rather than a standalone risk,” said Philippe Carrel, an executive vice-president in the trade and risk management division at Thomson Reuters, speaking at a liquidity risk briefing in Canary Wharf, London, on August 5. He also called for the Basel Committee on Banking Supervision to set up a liquidity data group to collect data on banks’ risk exposure and asset allocation. This data could then be published as a guide, allowing banks to gain a better picture of the market.
Carrel’s assessment of current risk management techniques was scathing. He suggested everything already known about risk management should be discarded, stating Basel II is “out of the window” and that until now “risk was just reported, and clumsily reported at that”. Risk management “requires more quantitative and qualitative research, as systemic risks have been created where regulators forced banks to use the same methods of stress testing,” Carrel said.
Much has been said in the operational risk space about the procyclicality of the advanced measurement approach to operational risk under Basel II, and here regulators are caught between a rock and a hard place. The internationalisation of financial services means regulators must work more closely together to supervise individual firms properly, and in doing so they have developed and shared best practices for risk management. As a result, firms seeking regulatory approval are gradually converging on similar approaches to risk, which means they all act alike in times of crisis. This goes beyond procyclicality.
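The herding mechanism described above can be illustrated with a toy sketch. This is not from the article and every name and parameter in it is hypothetical; it simply shows that when every firm applies the same risk trigger, their de-risking decisions coincide exactly, so stress produces a synchronised response rather than independent ones.

```python
def derisk_decisions(vol_path, trigger):
    """For each period, return the set of firms that cut exposure.

    Each firm applies the identical rule: cut when observed volatility
    exceeds `trigger`. With identical rules, the set of sellers in any
    period is either everyone or no one, so decisions are perfectly
    correlated across firms. (Illustrative only.)
    """
    firms = ["A", "B", "C"]
    return [{f for f in firms if vol > trigger} for vol in vol_path]

# Hypothetical volatility series: two calm periods, then a crisis.
calm_then_crisis = [0.10, 0.12, 0.35, 0.40]
print(derisk_decisions(calm_then_crisis, trigger=0.20))
```

In the calm periods no firm sells; once volatility breaches the shared threshold, all three sell at once. With heterogeneous triggers, the selling would instead be staggered.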
At conferences and in speeches throughout the first half of the year, regulators made clear they are aware of all this and are moving to plug the gaps. But they are moving too slowly for some, and more guidance on effective risk management needs to come more swiftly, if not from the regulators then from industry vendors.