LONDON – “Liquidity risk should be considered the ultimate operational risk rather than a standalone risk,” said Philippe Carrel, an executive vice-president in the trade and risk management division at Thomson Reuters, at a briefing on liquidity risk held in Canary Wharf, London, on August 5. He also called for the Basel Committee on Banking Supervision to set up a liquidity data group to collect data on banks’ risk exposures and asset allocations. This data could then be published as a guide, allowing banks to gain a better picture of the market.
Carrel’s assessment of current risk management techniques was scathing. He suggested everything already known about risk management should be discarded, stating that Basel II is “out of the window” and that until now “risk was just reported, and clumsily reported at that”. Risk management “requires more quantitative and qualitative research, as systemic risks have been created where regulators forced banks to use the same methods of stress testing,” Carrel said.
Much has been said in the operational risk space about the pro-cyclicality of the advanced measurement approach to operational risk under Basel II, and here regulators are stuck between a rock and a hard place. The internationalisation of financial services means regulators must work more closely together to properly supervise individual firms, and in doing so they have developed and shared best practices for risk management. As a result, firms are gradually adopting similar approaches to risk in order to win regulatory approval, which means firms all act similarly in times of crisis. This goes beyond pro-cyclicality.
In conferences and speeches throughout the first half of the year, regulators made clear they are aware of all this and are moving to plug the gaps. But they are moving all too slowly for some, and more guidance on effective risk management needs to come more swiftly, if not from the regulators then from industry vendors.