Risk.net convened a webinar in collaboration with Murex to explore how, as more financial institutions move to the cloud, they can get the most out of their technology investments
- Arnaud de Chavagnac, Head of cloud solutions marketing, Murex
- David Saul, Senior vice-president and chief scientist, State Street
- Moderator: Nazneen Sherif, Staff writer, Risk.net
More and more financial institutions are going to the cloud for their technology needs. This burgeoning migration promises to transform hardware-intensive processes vital to effective risk management, such as pricing calculations, regulatory reporting and disaster recovery.
But moving in-house hardware and legacy systems onto cloud-based applications carries challenges as well as gains. Firms are concerned about data security, for example, when pursuing the opportunities for speed, agility and consistency that cloud technology promises.
A poll conducted during the Risk.net webinar confirmed that many firms are only just beginning to consider cloud adoption. State Street, however, has been using cloud technology for at least six years. The US bank began by building its own private cloud data centres, providing insight into which applications are best suited to the transition.
“Our initial driver to use the cloud was to create a standardised environment to speed up the time to develop, deploy and retire applications,” said David Saul, senior vice-president and chief scientist at State Street. “The primary criterion we’ve considered is whether data is sufficiently well protected for our clients and ourselves. Our test strategy has been careful to maintain control over production‑level data.”
It’s never too late
Saul suggested that firms just beginning to experiment with cloud are in prime position to benefit from the trials and tribulations of earlier adopters and improved products from vendors.
“External cloud providers have greatly matured their offerings to include a lot more capable services. Software services that previously you might have had to build yourself are now offered by the cloud provider,” he said. “Rather than thinking you’re late to the party, consider that by avoiding being a pioneer you’re able to benefit from the experiences of those that came before you.”
Cloud’s coming of age has led to accelerating adoption, observed Arnaud de Chavagnac, head of cloud solutions marketing at Murex, a front-to-back-to-risk software provider: “It’s only relatively recently that we’ve seen many of our customers going to the cloud globally for mission-critical systems on the buy side and on the sell side. It’s happening now because of great results already achieved and great improvements performed by the cloud vendors working with other industries on other applications.”
He added: “Now all of this can be leveraged in this capital markets segment. When you see the stability, the scalability, the security and the cost savings, many more recent investments have been made possible thanks to some other sectors that have already adopted these techniques.”
Saul praised the efforts of the Cloud Working Group, part of the Object Management Group. The independent body has grown in five years to around 600 organisations in different sectors sharing best practices. “It’s an invaluable information resource. Our organisation has contributed to it. I suggest exploring that, and if you’re not already a member you should join,” he said.
Within the broad migration to the cloud, de Chavagnac stressed that financial services clients are pursuing different approaches, for variously weighted reasons and at differing adoption rates.
“On our side, we need to make sure we give enough options so we can cope with the different constraints that clients face, for example in the demands of local regulators, and what they might already have done in the past with their other systems,” he said.
De Chavagnac observed that many firms were proceeding in phases. Companies tend to migrate selectively, he noted, rather than move production data to the cloud in one swoop. “They will first test the results with certain parts of the installation,” he said. “For example, a Murex application may mix elements run on-site and those on the cloud. It’s important to match the maturity of the firm with the cloud subject, and to understand the constraints they may face from auditors and regulators.”
Saul implored would-be cloud migrants to focus on their own prior experiences. “Use your experience,” he said. “You don’t have to create something totally new for the cloud if you already have good practices for managing data centres and deploying servers and applications. Take those policies and procedures and adapt them if they need to be adapted, but it is the experience you already have that will stand you in good stead as you move to the cloud.”
De Chavagnac illustrated the point with a recent customer example. “We had a cloud workshop with one of our customers a few weeks ago, with participants from many departments that all needed to be involved for cloud migration,” he said. “Most of the skills needed already existed within the firm and were in the room – for the operating systems, for the database, for the applications and for security – with Murex as the software provider and the cloud vendor also present. When they are involved right at the beginning, you have the assets in place to migrate relatively quickly and smoothly, and you are in a good position to start your cloud journey.”
A poll conducted during the webinar revealed respondents’ priority applications for adopting cloud technology. Risk management, research and development, and pricing calculations figured highly. Disaster recovery, perhaps surprisingly, lagged behind.
Saul thought the research and development focus matched his own experience of the testing phase, including selecting cloud providers to match the company’s infrastructure, software and environmental factors.
“It encourages effective development and experimentation,” said Saul. “We’ve tied that in with creating test data that can be used in different environments, eliminating the concern about use of production data.”
Keeping disaster recovery systems up to date is another advantage of cloud, Saul explained. Duplicated in-house disaster recovery systems might not be consistently updated with the latest software patches, for example, whereas cloud offers consistency in this respect.
“In a cloud environment, where your disaster recovery and your production environment are being used interchangeably, you not only lower the cost because you don’t have hardware sitting idle, but you also eliminate the problem of keeping things up to date because everything is maintained at the same level,” he added.
Flexible access to capacity as required avoids servers sitting idle or being overwhelmed at peak demand, noted de Chavagnac. Capacity estimates have often proved inaccurate because they depend on factors that are difficult to predict, adding risk or wastage.
“If you know when you will need this extra capacity, then you will use it when you need it, and you will only pay for it when you use it. You can monitor and adjust capacity more easily. You no longer need to predict the unpredictable.”
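The pay-per-use arithmetic behind this point can be sketched with a toy comparison. All figures below are hypothetical illustrations, not vendor pricing or numbers from the webinar:

```python
# Toy comparison of fixed peak provisioning vs pay-as-you-go capacity.
# All figures are hypothetical, chosen only to illustrate the shape of the saving.

HOURS_PER_MONTH = 730

def fixed_cost(peak_servers, cost_per_server_hour):
    """On-premises style: pay for peak capacity around the clock."""
    return peak_servers * cost_per_server_hour * HOURS_PER_MONTH

def on_demand_cost(usage_profile, cost_per_server_hour):
    """Cloud style: pay only for server-hours actually consumed.

    usage_profile is a list of (hours, servers) pairs covering the month.
    """
    return sum(hours * servers for hours, servers in usage_profile) * cost_per_server_hour

# Example: a risk batch needs 100 servers for 50 hours a month,
# and 10 servers for the remaining hours.
profile = [(50, 100), (HOURS_PER_MONTH - 50, 10)]
rate = 0.50  # hypothetical cost per server-hour

print(fixed_cost(100, rate))        # 36500.0 – provisioned for peak all month
print(on_demand_cost(profile, rate))  # 5900.0 – pay only for hours used
```

A real comparison would need actual usage profiles and pricing tiers, but the structure of the saving is the same: the idle gap between peak and typical load is what pay-as-you-go eliminates.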
Cloud makes capacity planning easier by pooling it, Saul noted. In the past, organisations had to plan capacity for separate segments of processing, with multiple estimates for what was required to handle peaks for disaster recovery and other scenarios.
He added: “By bringing them together, we now have more assurance that our total number – since it can be reallocated instantaneously – is going to be able to handle peak situations, whether in production or disaster recovery. That makes the job of capacity planning much more straightforward than it was previously.”
Cloud’s pay-as-you-go access to capacity was cited by webinar poll respondents as the greatest benefit of adopting the technology. Its scalability in this respect reaps benefits for developing new applications, Saul highlighted, with major time and cost savings possible once internal hardware capacity is no longer a limiting factor.
“Before cloud, a typical application development would get to the stage of sizing the amount of processing and storage needed for development,” said Saul. “In that procurement process, hardware would be turned over to various infrastructure groups to build an operating system, a database or whatever is needed for the application development to proceed. In the best case, that could take weeks or probably many months.”
Cloud can transform this scenario, he explained. “Today, a standardised environment means deployment is mostly automated and happens in minutes. The time saving also enables greater experimentation to try new techniques or algorithms. If something doesn’t work out, the cost of changing it is negligible,” said Saul.
De Chavagnac has seen similar benefits from a regulatory compliance perspective, particularly for reporting requirements under the Fundamental Review of the Trading Book (FRTB).
“For FRTB, for example, you need to manipulate a vast amount of data, as well as calculations across many scenarios and points in time,” said de Chavagnac. “This requires a lot of computing power to produce results in limited time. Certain customers are ready to do these calculations but are waiting for the computing capacity they ordered months ago. You need to be ready to run those calculations at all times, and for the figures you use to be consistent throughout the organisation.”
Amid toughening regulatory requirements, security has improved as the technology has matured, both in protecting data and in providing robust disaster recovery, Saul suggested. However, a poll question revealed that security is still perceived as the top barrier to adoption.
“We’re making progress, and it constantly needs to improve,” said Saul. “Security and regulation have so much overlap. Caution and reluctance to move have been understandable because we need to know regulators are comfortable with what we’re doing.”
Data residency requirements from some regulators, for example, mean checking with cloud providers where data physically resides, while other data can be moved more flexibly and kept in lower-cost locations.
“Security starts with the data, and the vital aspect is control over the level of access. Everyone at your organisation should be matched to an appropriate level of access, whether they are limited to reading data or can make changes,” Saul said.
“We are all data-centric businesses,” he continued. “Data should be categorised by levels of protection. Think of it as concentric circles, with the crown jewels at the centre and the highest level of controls. That may be the information you never let off the premises. As you make your way through the expanding concentric circles, data in the cloud may require encryption in transit and at rest. Or it may be public data, requiring fewer controls and incurring lower associated costs.”
Cloud governance should be the same as internal processes, Saul emphasised. “If you try to create something entirely different for the cloud, you will waste the years of experience you’ve gained developing governance, and you’ll probably miss something,” he warned.
“External cloud providers will provide you with the certifications they comply with. Match those with what you already have, and you’ve got a good idea of what you can expect as you move data to cloud,” he added.
De Chavagnac focused on the team dynamic between cloud vendor, customer and application vendor. “You can’t achieve security alone. The important thing is that they are working hand in hand and know their responsibilities. The security experts should be part of the team from the beginning and throughout the journey,” he added.