This article was paid for by a contributing third party.

Blazing new analytical paths: Tackling data aggregation for new risk insights

As the risk function’s influence continues to grow within financial services firms, demand for quality integrated risk data to support a wider range of business-critical decisions is stretching the capabilities of existing technology to breaking point. A new platform for agile tools is needed to enable the analysis required and unlock strategic opportunities within the business

Financial services technology spending has ebbed and flowed over the post-financial crisis period, but two areas continue to see increasing investment: data and risk. Even as many of the most cumbersome regulations are now several years into implementation, and compliance budgets have steadily flattened, major investment banks and asset managers continue to fund these twin priorities. The primary reason? Deeper and wider application of risk analytics.

The drivers are well understood. Data volumes and consumption are up; so too is the variety of data sources and structures. Many firms muddled through the early post-crisis years by simply keeping up, stitching together spreadsheets and pre-packaged reporting solutions. New risk measures, capital charges and margin requirements – to say nothing of deeper reporting on enterprise-wide market and liquidity risk exposures – sometimes came and went without the data infrastructure upgrades or data governance frameworks needed to support the effective deployment of new analytics. There is unfinished business to address.

Ten years on, the mood has changed, and new strategic opportunities are there for the taking – a product of the steady rise of the risk function.

 

Dynamic and predictive

In an era when technological disruption is ubiquitous, firms are asking different, more subtle questions of their data in an effort to extract new value. Likewise, no longer the preserve of quantitative specialists, risk professionals – and risk data and analytics – now drive how financial services firms behave, and even how they are organised.

As a result, a broader spectrum of interrelated groups – not only risk teams, but quants, traders, sales, finance and accounting teams, software developers and reporting operations – need to be able to perform complex analyses that span these notoriously messy datasets and analytic silos. 

Driven by boardroom support, they collectively aim to build analytical capabilities to fight a new wave of challenges, whether margin compression among investment managers or bloated operating and regulatory costs for investment banks, and do so in a more holistic way. 

That has raised expectations of risk datasets, and aggregation and analysis tools, as institutions push new investment strategies and products, study more efficient allocation of activities (and cost) among business units, and seek to unlock leverage faster – and more precisely – than before. Tools based on pre-aggregation will not be able to support this enhanced analysis.

“Today, risk data must open new analytical paths – driven by these more open-ended questions,” says Peter Chirlian, founder and chief executive of aggregation and analytics provider Armanta, acquired by IBM in 2018 and now part of IBM Watson Financial Services. “Risk is now a dynamic environment, from real-time assessment and pre-trade analysis, through to manipulating regulatory constraints and predicting the impacts from a proposed change.”

 

Road to reinvention

To get there, developing the right technology alchemy – geared towards architectural flexibility and performance to support a greater variety of user queries – is crucial. But this remains tough to achieve. Chirlian argues the challenge is down to one core issue: that risk analysis has “historically tended to resist big data principles” that have caught on elsewhere, and has therefore fallen behind.

Many firms find their risk data ecosystems – the foundation where compute power, user interface requirements, taskflow and microservices converge – are buckling under contemporary analytics’ new requirements. Several reasons explain this, he says. 

To start, solutions that rely on pre-aggregation still rule. Risk simulations and modelling have always been among the most computationally advanced and theoretically involved workloads at any bank or investment firm but, by presupposing the questions that will be asked, these solutions limit the types of analysis that can be performed.
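This limitation is easy to see in miniature. The following sketch – purely illustrative, written in Python with pandas, using hypothetical column names and figures rather than any vendor’s schema – contrasts a pre-aggregated report, which can only answer the grouping it was built for, with position-level data that can be re-aggregated on demand to answer a new question.

```python
# Illustrative only: pre-aggregation versus aggregation on demand over a
# hypothetical position-level dataset (column names and figures are made up).
import pandas as pd

positions = pd.DataFrame({
    "desk":         ["rates", "rates", "credit", "credit", "fx"],
    "book":         ["govies", "swaps", "ig", "hy", "spot"],
    "legal_entity": ["LE1", "LE1", "LE2", "LE2", "LE1"],
    "exposure":     [1.2, 0.8, 2.1, 3.4, 0.6],   # additive measure, $m
    "notional":     [500, 750, 200, 120, 300],   # $m
})

# A pre-aggregated report fixes the question up front: exposure by desk.
# If only this table is stored, no other grouping can be recovered from it.
exposure_by_desk = positions.groupby("desk")["exposure"].sum()

# Keeping position-level data lets a different question be answered on demand,
# for example exposure and notional by legal entity and book.
by_entity_and_book = (
    positions.groupby(["legal_entity", "book"])[["exposure", "notional"]].sum()
)

print(exposure_by_desk)
print(by_entity_and_book)
```

The point is not the toy numbers, but that the second grouping simply cannot be produced from the first table once the underlying detail has been discarded.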

Next, business processes, hierarchy and development practices also vary significantly between key enterprise functions, and even among trading desks – which can hamper a more widespread, collaborative approach to analysis. 

And third, many are keen to bring machine learning or other advanced artificial intelligence applications into play, but are frustrated to find the historical data required – often sitting trapped in silos or data lakes – has governance around it designed primarily with pre-aggregation and a restricted purpose such as a specific regulatory report in mind. The organisation wants to push forward, while analytical tools remain trapped in the past.

Therefore, today’s institutional priority – and millions in continuing investment – comes down to moving from reporting to true analysis. Whether for highly quantitative traders and risk-takers, executive management or risk and operations officers, it amounts to a fundamental paradigm shift for the underlying technology.

Purpose-built grid computing and infrastructural clustering are required, capable of completing complex aggregations or ‘what-if’ analyses “within 10 or 20 seconds, not two hours”, says Chirlian. The platform must also be easy to integrate, pulling data from existing data stores and spanning those sources, whether on-premises or – increasingly – in the cloud.

“Once you break down those pre-aggregation barriers, you open a far wider analytic world,” he explains. “Your risk platform really needs to be self-service – flexible and truly capable of computation on the fly.”
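As a rough sketch of the scatter/gather principle behind that kind of on-the-fly computation – an assumption for illustration, not Armanta’s or IBM’s actual architecture – the example below aggregates an additive measure across data partitions in parallel and then merges the partial results. In a real deployment the partitions would sit on grid nodes or separate data stores rather than local processes.

```python
# Illustrative scatter/gather aggregation: partial aggregation per partition,
# run in parallel, then merged. Partition layout, measure and function names
# are assumptions for this sketch, not any vendor's API.
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

# Each "partition" stands in for a slice of positions held on one node or store.
PARTITIONS = [
    [("rates", 1.2), ("rates", 0.8)],
    [("credit", 2.1), ("credit", 3.4)],
    [("fx", 0.6)],
]

def aggregate_partition(rows):
    """Local step: sum an additive exposure measure by desk within one partition."""
    totals = Counter()
    for desk, exposure in rows:
        totals[desk] += exposure
    return totals

def aggregate_all(partitions):
    """Fan out the partial aggregations in parallel, then merge the results."""
    merged = Counter()
    with ProcessPoolExecutor() as pool:
        for partial in pool.map(aggregate_partition, partitions):
            merged.update(partial)   # Counter.update adds values per key
    return dict(merged)

if __name__ == "__main__":
    print(aggregate_all(PARTITIONS))   # e.g. {'rates': 2.0, 'credit': 5.5, 'fx': 0.6}
```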

 

Expanding what-ifs

This new perspective is also reflected in the changing contours of regulatory reporting. For example, new modelling for the Fundamental Review of the Trading Book (FRTB) has evolved to become more business-focused. Higher-order questions such as restructuring the business for optimal capital treatment, or avoiding penalisation by realigning the trading desks, are now squarely in play.

This new reality – an ongoing, institutionalised experimentation – demands massive operational change. For instance, manipulating potential trades, and even modifying the legal entity hierarchy of the bank to understand how capital constraints might be affected under new regulatory conditions, is complex and compute-intensive. 

Modelling dependencies must be managed, including tracking computational changes and parameterisation. Later in the process, additional analysis requires data lineage to understand the source systems feeding data into those outcomes. And yet many legacy risk data platforms are incapable of carrying that load, or tracking these outcomes effectively.
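A stripped-down illustration of such a what-if – hypothetical, not drawn from any production system – is to re-map trading desks to legal entities, re-aggregate an additive exposure measure under both the current and the proposed hierarchy, and record which source positions fed each figure as a crude stand-in for lineage.

```python
# Illustrative 'what-if' on the legal entity hierarchy: re-map desks to
# entities, re-aggregate, and record which positions fed each result.
# All identifiers and figures are hypothetical.
from collections import defaultdict

positions = [
    # (position_id, desk, exposure in $m)
    ("p1", "rates",  500.0),
    ("p2", "swaps",  750.0),
    ("p3", "credit", 200.0),
]

current_mapping  = {"rates": "LE1", "swaps": "LE1", "credit": "LE2"}
proposed_mapping = {"rates": "LE1", "swaps": "LE2", "credit": "LE2"}

def aggregate_by_entity(positions, desk_to_entity):
    """Sum exposure by legal entity and record contributing position ids."""
    totals = defaultdict(float)
    lineage = defaultdict(list)
    for pid, desk, exposure in positions:
        entity = desk_to_entity[desk]
        totals[entity] += exposure
        lineage[entity].append(pid)
    return dict(totals), dict(lineage)

baseline, _ = aggregate_by_entity(positions, current_mapping)
scenario, scenario_lineage = aggregate_by_entity(positions, proposed_mapping)

for entity in sorted(set(baseline) | set(scenario)):
    delta = scenario.get(entity, 0.0) - baseline.get(entity, 0.0)
    print(entity, "change:", delta, "from positions:", scenario_lineage.get(entity, []))
```

Scaling that idea to a full institution – thousands of books, non-additive capital measures and recomputation of regulatory constraints – is what makes the workload compute-intensive and the lineage bookkeeping essential.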

It is proof that “today’s data aggregation must go well beyond a vendor-prescribed set of use cases”, Chirlian says. “Narrowly squeezing analytic paths into a multidimensional online analytical processing cube won’t suit. You need a mechanism that goes beyond the traditional sense of aggregation. Whether regulatory requirements such as FRTB and International Financial Reporting Standard 17, or management reporting, risk teams always begin by asking ‘How do I explain these results, either internally or to a regulator?’”

That requires an analytical platform that can drill down, organise data, and ultimately answer those fundamental questions. “Until you can do that,” Chirlian concludes, “you may not know what direction your enquiry should go.”

About IBM Watson Financial Services

IBM is working with organisations across the financial services industry to use IBM Cloud, cognitive, big data, regulatory and blockchain technology to address their business challenges. Watson Financial Services merges the cognitive capabilities of Watson and the expertise of Promontory Financial Group to help risk and compliance professionals make better informed decisions to manage risk and compliance processes. These processes range from regulatory change management to specific compliance processes, such as anti-money laundering, know your customer, conduct surveillance and stress testing.

Learn more about IBM financial risk and regulatory compliance solutions, and follow @IBMFintech on Twitter.
