If banks had been hesitant to move their data and processes to the cloud, recent regulatory requirements have certainly provided a final nudge in that direction.
The revised market risk regulation finalised by the Basel Committee on Banking Supervision at the start of 2016 – the Fundamental Review of the Trading Book (FRTB) – is expected to overhaul existing risk calculations at banks, making them more computationally demanding. One recent industry analysis estimates portfolio risk calculations could increase by as much as tenfold.
It doesn’t help that pricing calculations are also becoming increasingly complex, requiring banks to factor adjustments for the costs of counterparty risk, funding, capital and initial margin – collectively known as valuation adjustments, or XVAs – into prices.
In the past, larger banks have thrown money at computational problems, investing heavily in their own data centres and in advanced technologies such as graphics processing units. Banks that did not follow suit were mostly held back by restrictive IT budgets or legacy systems, which made the transition enormously challenging.
Cloud computing has been available for many years as a means of expanding banks’ computing capacity, but security concerns have kept most firms from taking the plunge.
However, that is about to change – whether the industry likes it or not. The additional strain on resources from increased competition and regulations such as FRTB leaves banks with no other choice.
“There is no other way,” said one risk manager at a European bank earlier this year, as the bank began moving its FRTB and XVA calculations to the cloud.
Three European banks have so far confirmed plans to use the cloud for FRTB and XVA calculations. One US bank is already using it to run stress tests. Some buy‑side firms are exploring its use in machine learning, portfolio optimisation and disaster recovery.
All of this means firms need to adapt. They also need to become comfortable with the idea that their sensitive data and calculations will be hosted on an external platform that could, in theory, be accessed by anyone if security is insufficient – and the industry is far from that level of comfort.
Security and compliance appear to be the main obstacles to wider adoption of the technology. Some vendors even argue cloud platforms offer better security than in‑house systems. Yet many market participants remain hesitant to move their sensitive data to the cloud.
What the industry needs now is to put security in the spotlight and build consensus among clients, vendors and regulators on how it can be managed. As with any attempt to overhaul the way a business operates, this will take time and effort.
“This is as much a cultural challenge as anything else,” says Paul McEwen, group head of infrastructure and security engineering at UBS in London.