Market Technology Awards 2018: New problems, new solutions

Vendors embrace cloud, alternative languages and agile approaches

Market participants face a raft of new challenges. Vendors are trying to provide the answers – in many cases, using innovative processes, ideas and technology. Clive Davidson speaks to some of the winners of Risk.net’s Market Technology Awards 2018

Financial markets are beset by challenges on all sides. Whether it is complex and onerous new regulatory reporting, unprecedented market conditions, outmoded practices or more demanding clients, institutions are turning to technology for help. In some cases, start-ups are stepping forward with innovative solutions. In others, answers are coming from new tools and technologies, such as leading-edge programming languages, systems architectures and data analysis techniques.

With banks under continuing cost and revenue pressure, simplicity and agility have become more important. The old model of delivering software like a huge jigsaw puzzle that takes months – if not years – to piece together, while expensive, inflexible hardware to run it on is installed in the basement, is becoming less viable. Increasingly, vendors are delivering their software as a service, hosted either on a private network or via the cloud.

And with the demand for less specialisation and more cross-asset and integrated functionality, new development techniques and platforms are emerging that enable institutions to build their own customised applications within standardised frameworks. 

In some instances, institutions are recognising that they are all duplicating functions to no competitive advantage, and are therefore looking at alternative collaborative solutions, such as utilities. 

Meanwhile, there are some areas that are becoming of more critical importance, but where only a few vendors so far have had the understanding and foresight to build solutions. All these trends are visible in this year’s Risk.net Market Technology Awards.

Onerous regulations

Foreign exchange is an area ripe for innovation. An absence of volatility, a new global code of conduct and increasingly onerous reporting requirements – particularly the revised Markets in Financial Instruments Directive (Mifid II) – are driving forex players to look for improved transaction cost analysis and best execution.

“Whether you are an active or a passive fund, if you can save even a handful of basis points on your execution, it has a big impact on the bottom line,” says Pete Eggleston, co-founder and director of BestX, winner of the best execution product of the year award.

BestX has taken established techniques from equities and added new twists for forex. Instead of just measuring execution for individual trades, the company enables institutions to implement a comprehensive workflow around their best execution policy, from configuring the policy’s key execution factors to the priority or weighting given to each. The company has also brought real-time operation to forex transaction cost analysis – its software can analyse a transaction in under three milliseconds through the use of cloud computing – and allows users to investigate their transaction data and quickly produce reports for various stakeholders.
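
As a rough illustration of the idea, and only an illustration rather than BestX’s actual methodology, a best execution policy can be thought of as a set of weighted factors against which every trade is scored. The factor names, weights and scoring scheme below are assumptions made for the sketch.

```python
# Illustrative sketch of a configurable best-execution score: the institution
# defines its execution factors and the weight it gives each one, and every
# trade is scored against that policy. All names and numbers are assumptions.

# Each factor score is normalised to the range [0, 1], where 1 is best.
EXECUTION_POLICY = {
    "spread_cost": 0.4,    # cost versus mid at the time of execution
    "market_impact": 0.3,  # post-trade price drift attributable to the order
    "speed": 0.2,          # time from order to fill
    "fill_ratio": 0.1,     # proportion of the order actually filled
}

def execution_score(factor_scores: dict) -> float:
    """Weighted average of per-factor scores under the firm's policy."""
    total_weight = sum(EXECUTION_POLICY.values())
    weighted = sum(
        weight * factor_scores.get(name, 0.0)
        for name, weight in EXECUTION_POLICY.items()
    )
    return weighted / total_weight

# Example: a trade with a tight spread but noticeable market impact.
trade = {"spread_cost": 0.9, "market_impact": 0.6, "speed": 0.8, "fill_ratio": 1.0}
print(f"Execution score: {execution_score(trade):.2f}")
```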

Meanwhile, the narrowing of spreads and lower volumes in Group of 10 currencies has made emerging markets look more attractive to many institutions. However, those wanting to increase their emerging market currencies activity have faced frustration. Banks have been reluctant to add the currencies to their electronic forex platforms in the absence of a reliable price feed. There has been a lack of liquidity, with few traders, and little transparency as the market has been mainly voice-traded. Furthermore, the split between over-the-counter and exchange-traded activity required separate systems and processes.

“Traders just want to trade, irrespective of whether the prices are over-the-counter or on exchange,” says Jon Vollemaere, chief executive of R5, which he set up to create a solution – a new platform that sits between over-the-counter and exchange trading, supporting prices from both. 

The platform, R5FX, also offers a choice of central credit or clearing, and allows participants to use ‘prime clearing’, whereby they can trade with counterparties they do not have direct credit with and have trades cleared via a prime broker. 

“That really opens up the emerging currencies, as participants can now trade freely with each other and lower risk at the same time,” says Vollemaere.

Now, with the availability of electronic trading, increased transparency and simplification of trade processing, emerging market forex is set to follow the same trajectory as trading in G10 currencies, with aggregators and algorithms increasing market dynamism.

Fintech firms such as BestX and R5 face a challenge: they need to bring a product to market quickly, but have limited resources. The typical development process in capital markets is for quants to code up models in statistical or mathematical languages such as R or Matlab, or a development language such as Python, and then hand the model over to IT specialists who rewrite it in a language suitable for production, such as C++.

“As a small company, we didn’t have the resources to rewrite all our models, but we needed to get a minimum viable product out quickly,” says Eggleston of BestX. So the company turned to Julia. 

“Julia is a new language that allows us to put together models quickly because it is similar to Python, in that it is a high-level language, but it has the performance benefits of something like C++. It means we don’t have to recode everything – it has been a massive accelerator for us,” says Eggleston. The company is now working closely with the founders of Julia on its continuing evolution, while also making use of other advanced languages, such as TypeScript and Q.

Alternative treatment

Another fintech firm, Beacon Platform, offers an alternative remedy for the inefficiency of the two-step development process. “A problem we often see on both the buy and sell sides is that although the quants understand the business and are good at solving problems in spreadsheets or Python, they aren’t as good as the IT department at turning these into enterprise production systems,” says Mark Higgins, co-founder and chief executive of Beacon Platform.

Quant ad hoc solutions tend to take on a life of their own, becoming a complex web of system extensions that is a nightmare to maintain, painful to change, and sometimes impossible to even understand. Beacon says it avoids this by providing workflow tools for translating code from development to production in a consistent, controlled and maintainable, but flexible, way. The platform also aims to address one of the most pressing issues currently facing capital markets’ chief technology officers.

As one of the awards judges put it: “Banks today are trying to get rid of their specialised pricing and valuation systems, and want services that cover all asset classes.”

“Now, instead of asking vendors what specific asset class libraries they have, banks want to know how the vendor is bringing asset classes together, how it gathers data from various sources, so it can be accessed by a centralised set of analytics libraries, how it might use machine learning to analyse the data, and how its analytics libraries can be used on a cloud platform.”

Beacon says its flexible development and production platform can be used to supplement a bank’s current infrastructure or to build new cross-asset functionality from scratch. It allows users to plug in multiple analytical libraries, and also provides a set of financial object models. 

“We have just one right way to represent an interest rate swap, for example. It can be called by the interest rate library – or by the equities desk or anyone else if they want to price an interest rate,” says Higgins.
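
A minimal sketch of that single-representation idea might look as follows; the field names and structure are illustrative assumptions, not Beacon’s actual object model.

```python
# One canonical object model for an interest rate swap that any library or
# desk can consume. Fields and names are illustrative, not Beacon's model.
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class InterestRateSwap:
    notional: float
    currency: str
    effective_date: date
    maturity_date: date
    fixed_rate: float    # e.g. 0.025 for 2.5%
    float_index: str     # e.g. "USD-LIBOR-3M"
    pay_fixed: bool      # True if we pay fixed and receive floating

# The same object can be handed to the rates library for pricing, to the
# equities desk for a funding calculation, or to a risk report, so every
# consumer agrees on what "an interest rate swap" is.
swap = InterestRateSwap(
    notional=10_000_000, currency="USD",
    effective_date=date(2018, 1, 15), maturity_date=date(2023, 1, 15),
    fixed_rate=0.025, float_index="USD-LIBOR-3M", pay_fixed=True,
)
```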

Restricting client access

Beacon also addresses another emerging issue. “Banks are now trying to export as much as they can of their internal technology and analytics to their clients, and are finding it really hard. Their internal systems aren’t configured to do it, and they don’t want to just give unfettered access to their internal environment to clients,” says Higgins. 

With Beacon’s platform, banks can spin off securely segregated environments that allow clients to access the bank’s analytics and data, as well as incorporate their own data without the bank being able to see it. 

Beacon, like BestX and many of the other award winners, is exploiting the advances of cloud computing to offer its software as a service. The arguments for SaaS and cloud are now compelling – and they are not all about cheap computation. 

Cloud computing

StatPro was one of the early adopters, deciding in 2008 to develop a new performance and attribution application exclusively for the cloud. At the time, it was facing a set of issues familiar to most software vendors. Over 300 clients were using its traditional on-premises application, often with versions they had customised to their particular business requirements. Supporting and upgrading these numerous and varied clients was a burden – costly and time-consuming for the clients as well – and constrained the degree to which the company could grow its user base.

“It meant our clients weren’t getting the best value from us because when we released a new version of our product, it sometimes took up to 18 months to get the version installed at the end-user due to IT project constraints and delays,” says London-based Neil Smyth, marketing and technology director at StatPro.

The cloud shrinks the footprint of an application down to a single version on a single platform, freeing up both vendor and client IT staff, who can then be redeployed – in the case of the vendor, on application development. And it allows the vendor to introduce new versions more frequently.

“Like most traditional software vendors, we used to do one big software release a year. That big dump of software is hard for clients to digest – it takes a lot of testing, and is expensive. With cloud, we can deploy multiple releases a year, with less requirement for user testing. It is a much more streamlined way of delivering software – we can go five times faster, in terms of releases, than we could before,” says Smyth.

Cloud has limitations

The cloud does not obviate the need for good functionality. Survey after survey shows that meeting business requirements remains institutions’ number one criterion when choosing software. Again, the cloud can offer an advantage, Smyth argues.

“Cloud delivery has enabled us to expand and accelerate the development of functionality for our platform more than we could ever do with our traditional product because we have so much more development time available,” he says.

The cloud also ticks the boxes of institutions’ other main selection criteria – cost, ability to handle large data volumes, scalability and ability to integrate with other infrastructure. 

But the cloud is not without limitations and challenges. “You can’t just port an on-premises application to cloud – you don’t get the benefits of a native cloud application that way,” says Smyth. Also, institutions have customised their installed software for good reasons – to adapt it more closely to their requirements and business operations. By operating just a single version of the software, cloud removes this possibility. 

“With cloud, you can’t easily customise, so your software needs to be flexible with wide configuration options,” says Smyth. Essentially, it means understanding and anticipating as much as possible the individual requirements that would formerly have manifested as local tailoring, and building them into the cloud application as options. Providing application programming interfaces through which users can link to the cloud-based application and extract data also helps offset the loss of local control.

Higgins of Beacon highlights another advantage of the cloud – it allows businesses to experiment with computationally demanding problems quickly and cheaply. Take a prototype application that might require 1,000 processor cores to test: in a traditional setup, the quants request the budget for the computers, wait for them to be delivered and configured, and perhaps six months later get to try their idea. If it does not work, the organisation is stuck with the expensive hardware. With a cloud-based platform, the quants request 1,000 cores, and have them available within minutes.

“The quants can run their experiment, and only pay for the computing time they use. They get to do it quickly. And if it works, they can scale it up for production. And if it doesn’t, they just turn it off,” says Higgins.
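
The economics are easy to sketch. The figures below are illustrative assumptions rather than quoted prices, but they show why the pay-per-use model changes the calculation for a speculative experiment.

```python
# Back-of-the-envelope comparison of the two models Higgins describes.
# All prices and durations are illustrative assumptions, not quoted figures.
cores = 1_000
experiment_hours = 4

# Pay-per-use: rent the cores only while the experiment runs.
cloud_rate_per_core_hour = 0.05          # assumed on-demand rate, USD
cloud_cost = cores * experiment_hours * cloud_rate_per_core_hour

# Traditional: buy the hardware up front, whether or not the idea works.
cost_per_owned_core = 100.0              # assumed purchase and installation, USD
owned_cost = cores * cost_per_owned_core

print(f"Cloud experiment: ${cloud_cost:,.0f}, available within minutes")
print(f"Owned hardware:   ${owned_cost:,.0f}, plus a months-long procurement cycle")
```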

Sharing: the logical solution

There are a number of operations in the industry that are either primarily about communicating and sharing information, or common across all players – in these cases, there may be little or no competitive opportunity in how they are carried out. The logical solution, especially where there is already well-established practice and where relevant regulations are well understood, is a shared utility service. A number of such utilities are already established in areas such as clearing, messaging and sharing ‘know-your-customer’ data. But there are fewer than might be expected, partly because institutions until now have been able to bear the costs of duplication, and because utilities come with their own problems. However, every so often, a challenge emerges where the arguments for a utility approach are overwhelming. 

AcadiaSoft Hub is a utility created by a bank collective, in response to the new margin rules on non-cleared derivatives. In an era of tight budgets and a drive to reduce technological complexity, the banks were keen to avoid the cost of individually interpreting the rules and building the infrastructure to implement them. But this meant agreeing and standardising operational procedures – which has been a major stumbling block for many utility initiatives. 

Working with the International Swaps and Derivatives Association, AcadiaSoft convened nine working groups, eventually involving over 900 people from the participating banks, to thrash out how the process of calculating, reconciling and exchanging sensitivities and margin calls between market participants should work. It also required the creation of a brand-new file type, the Common Risk Interchange Format, to enable the exchange of standardised risk information. 
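
The reconciliation step at the heart of that process can be sketched simply. The record layout and tolerance below are illustrative only; they are not the actual Common Risk Interchange Format specification.

```python
# Simplified sketch of sensitivity reconciliation: two counterparties each
# compute sensitivities for the same portfolio, and differences above a
# tolerance are flagged for investigation before margin is exchanged.
TOLERANCE = 1_000.0  # assumed absolute tolerance in the amount currency

def reconcile(ours: dict, theirs: dict) -> list:
    """Return the risk-factor keys where the two parties' numbers diverge."""
    breaks = []
    for key in sorted(set(ours) | set(theirs)):
        diff = abs(ours.get(key, 0.0) - theirs.get(key, 0.0))
        if diff > TOLERANCE:
            breaks.append(
                f"{key}: us={ours.get(key, 0.0):,.0f} "
                f"them={theirs.get(key, 0.0):,.0f} (diff {diff:,.0f})"
            )
    return breaks

ours = {"IR_USD_5Y": 125_000.0, "IR_EUR_10Y": -48_000.0}
theirs = {"IR_USD_5Y": 124_500.0, "IR_EUR_10Y": -61_000.0}
for line in reconcile(ours, theirs):
    print("Break:", line)
```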

One of the short-term challenges in creating the hub has turned out to be a long-term advantage. Because the timetable to introduce the new rules was so tight and the regulators continued to modify the rules up to the last minute, AcadiaSoft was forced to create a flexible architecture that could be implemented and tested, but still accommodate change. 

“It took more investment than just building a system without that level of flexibility. However, now we have an application that not only serves the firms subject to the margining rules, but we are able to make changes and introduce new features quickly, including things that are not necessarily to do with the rules, such as adding more operational efficiency,” says AcadiaSoft chief executive Chris Walsh. 

Utilities usually interact with in-house systems that handle part of the process. These ‘unilateral implementations’, as Walsh calls them, are often provided by third-party vendors. An interesting prospect with the AcadiaSoft Hub is that as it expands its services, it may make some of these systems redundant. 

“We have a hub that interconnects all the unilateral implementations and provides a set of shared services. We believe that there are some classes of clients that, by embracing all the services on the hub, over time may have a lesser need for separate unilateral services, particularly those that are operated in-house,” says Walsh. 

Stress-testing regimes

The judges noted that, in some other critically important areas, the industry is not yet well served by technology solutions. Stress testing is one, and economic scenario generation another. Beyond the current regulatory stress-testing regimes, such as those of the UK’s Prudential Regulation Authority or the European Banking Authority, the introduction in 2018 of International Financial Reporting Standard 9 for accounting for financial instruments will increase the need for enterprise-level tools for these tasks.

“When you look at the implications of IFRS 9, it will become even more important for institutions to calculate their capital accurately and efficiently, yet there are very few enterprise solutions available in this space,” remarked one judge.

One reason for the market being slow in coming up with enterprise solutions for stress testing is the tendency for institutions to take a tactical approach to the problem, says Burcu Guner, Europe, Middle East and Africa stress-testing practice leader at Moody’s Analytics. Although this enables institutions to meet the immediate requirements of authorities, “a tactical approach will become less sustainable over time as requirements increase, as well as the frequency of testing and the need for more real-time analysis. This will encourage institutions to think more strategically about stress testing”, says Guner.

Stress testing requires more than just software to run scenarios. It needs analytical models (such as credit models, loss estimation or risk-weighted asset forecasting), data (internal and external) and reporting capabilities. A strategic approach will become more critical with the introduction of IFRS 9, which requires banks to produce forward-looking term structures of risk to estimate lifetime expected losses. The commonalities with stress testing in terms of process, models and data cry out for harmonisation and shared facilities, says Guner.
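
The forward-looking calculation IFRS 9 asks for can be illustrated with a deliberately simplified example. The probabilities, exposures and discount rate below are assumptions for the purpose of the sketch, not calibrated inputs.

```python
# Simplified illustration of a lifetime expected credit loss: a forward-looking
# term structure of default risk, combined with loss-given-default and exposure,
# discounted back to today. All numbers are assumptions for illustration.
marginal_pd = [0.010, 0.012, 0.015, 0.018, 0.020]      # probability of default in each year
lgd = 0.45                                             # loss given default
ead = [1_000_000, 900_000, 800_000, 700_000, 600_000]  # exposure at default per year
discount_rate = 0.03

lifetime_ecl = sum(
    pd_t * lgd * ead_t / (1 + discount_rate) ** (t + 1)
    for t, (pd_t, ead_t) in enumerate(zip(marginal_pd, ead))
)
print(f"Lifetime expected credit loss: {lifetime_ecl:,.0f}")
```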

Moody’s Analytics is one of the few vendors so far to have had the foresight to design its stress-testing solution with the flexibility to straddle financial and accounting regulatory regimes, as well as support strategic business planning.

Stress testing, like many other regulatory responses, generates enormous volumes of data. Having spent significant time and resources producing it, many institutions are wondering if there might be further value hidden in their information warehouses – and if so, how to find it. Vendors are reaching for tools such as data mining, visualisation and, increasingly, machine learning. Axioma, BestX and Beacon are among the award winners that are incorporating machine-learning techniques and tools into their applications. But although such facilities can be powerful and bring valuable insights, they are not always appropriate.

Granular data

When Droit Financial Technologies looked at how it could help institutions quickly sift through masses of new market rules, including Mifid II, to assess the eligibility and obligations of a proposed transaction, it eschewed machine learning. 

“When institutions take decisions about their transactions, they have to be able to explain them in a granular manner, and you can’t do this with statistical ‘black box’ machine-learning techniques,” says Satya Pemmaraju, co-founder and chief executive of Droit. 

Droit’s software picks up transactions as institutions originate them, or as they come into the institution for quotes, and runs them through its database of digitised rules to determine which regulatory regime(s) the transactions come under and identify the obligations of both counterparties, pre- and post-execution. “Machine learning is fundamentally statistical, whereas in the world of transactional regulatory compliance, you have to be completely deterministic and correct 100% of the time. And you have to be able to explain why you decided a particular transaction did or did not need to be reported under Mifid II, and show the logic and data that you used. That is crucial in any system that makes decisions, and will become even more so in the future,” says Pemmaraju.
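
The contrast with a statistical model is easiest to see in a sketch of a deterministic, explainable rule check, in which every decision records the rules applied and the data used. The rules and field names below are illustrative assumptions, not Droit’s digitised rule set.

```python
# Deterministic eligibility check that records its own audit trail, so the
# outcome can be reproduced and justified later. Rules here are illustrative.
REPORTING_THRESHOLD = 0.0  # assumed: any positive notional in scope is reportable

def mifid2_reporting_decision(txn: dict) -> dict:
    """Return the decision plus the rules and inputs that produced it."""
    audit = {"transaction_id": txn["id"], "inputs": dict(txn), "rules_applied": []}

    # Rule 1 (illustrative): instruments traded on an EU venue are in scope.
    in_scope = txn["traded_on_eu_venue"]
    audit["rules_applied"].append(f"in_scope = traded_on_eu_venue -> {in_scope}")

    # Rule 2 (illustrative): in-scope trades above the threshold are reportable.
    reportable = in_scope and txn["notional"] > REPORTING_THRESHOLD
    audit["rules_applied"].append(
        f"reportable = in_scope and notional > {REPORTING_THRESHOLD} -> {reportable}"
    )

    audit["decision"] = "report" if reportable else "do not report"
    return audit

decision = mifid2_reporting_decision(
    {"id": "TXN-001", "traded_on_eu_venue": True, "notional": 5_000_000}
)
print(decision["decision"])
print(decision["rules_applied"])
```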

Elsewhere, vendors are drawing on techniques from artificial intelligence and other disciplines to help their clients interpret their markets. BestX, for example, is working with University College London to see if earthquake- and tremor-modelling techniques can provide insights into liquidity shocks and ‘flash crashes’, which appear to be happening more frequently in the forex market. “We can’t predict a flash crash, but we want to see if, once the crash has happened, we can model the impact on liquidity across the market for the rest of the day,” says BestX’s Eggleston.

Overall, the awards demonstrated excellence and innovation in bringing technology to bear on today’s market challenges. Perhaps surprisingly, however, the awards did not feature the technology that has probably been most talked about recently – blockchain and distributed ledger. 

“There’s a lot of cost saving to be had by looking seriously at blockchain and distributed ledger technology as a way to manage credit latency and credit aggregation,” says R5’s Vollemaere.

His firm has designed its platform and central credit and clearing model to allow it to add such services as they come on stream. Other vendors are also looking at how they might incorporate such developments into their architecture.

Introducing the awards – and this year’s judges

Readers of Risk.net may remember the long-running rankings of technology vendors that were published annually in December. These awards replace them. Like the rankings, they are a way of reflecting the contribution made to our markets by technology vendors and highlighting some success stories. The methodology, though, is completely different.

The rankings were created by splitting the risk technology market into 30-odd product categories. Winners were selected on the basis of an online poll – and, with some restrictions, the poll was open to all.

This approach had the great advantage of appearing very democratic. It also had the great drawback of favouring established firms – those with large, existing customer bases that would vote for them. A perennial complaint from younger, smaller vendors was that they had little chance of recognition.

In the current environment of technology change and innovation – where so many smaller firms are making a contribution – the poll’s weaknesses outweighed its strengths.

The Market Technology Awards set out to be more inclusive and meritocratic. Firms that wanted to be considered were invited to submit short pitch documents, explaining what made them different from their rivals, and showcasing changes they had made to their software in the preceding 12 months. These pitches were shared with a panel of 13 judges, assembled by the Risk.net editorial team. The panel convened in October to discuss the pitches and pick their winners.

The resulting roll of honour contains many first-time winners, and a mix of big and smaller firms. 

The new approach is not perfect – good firms could be overlooked on the basis of a bad pitch, for example. However, we’re open to suggestions – please send any to: duncan.wood@infopro-digital.com.

The judging panel for the Market Technology Awards comprised:

  • Sean Coppinger, head of risk technology, Standard Chartered Bank
  • Sid Dash, research director, Chartis Research
  • Clive Davidson, contributing editor, Risk.net
  • Ian Green, chief executive and co-founder, eCo Financial Technology
  • Elly Hardwick, head of innovation, Deutsche Bank
  • Simon Lumsdon, head of technology, Hermes Investment Management
  • Stéphane Malrait, global head of e-commerce and innovation for financial markets, ING Bank
  • Brad Novak, managing director, chief technology officer for Barclays’ investment bank
  • Ray O’Brien, global risk chief operating officer and head of global risk analytics, HSBC Group Management Services
  • Hugh Stewart, research director, Chartis Research
  • James Turck, head of global markets architecture and core engineering, Credit Suisse
  • Tom Wilson, chief risk officer, Allianz
  • Duncan Wood, editor-in-chief, Risk.net.

A full PDF of the Market Technology Awards 2018 print issue may be downloaded here
