Banks must break data silos to improve pricing decisions
Data consistency is increasingly key to judging risk and reacting quickly in a crisis, writes former XVA practitioner
After a decade or so of slim pickings, where revenues slumped in the face of a sluggish global economy, enhanced oversight and new competition, double-digit returns on equity have returned to markets divisions as interest rates have backed up abruptly alongside heightened geopolitical tensions.
But despite significantly improved results for some, banks’ trading and markets divisions are under pressure from all sides. Navigating a profitable path forward is going to require firms to take a sharper look at whether they are being correctly compensated for the risks they are running and the costs they incur.
Central clearing of derivatives, and the related introduction of bilateral margin rules, together with guidelines designed to strengthen banks’ risk data aggregation and counterparty credit risk management capabilities, have all contributed to a post-GFC increase in the complexity of pricing bilateral derivatives, and in the number of data dependencies required to support them.
The consequent need for operational certainty in managing and using data highlights two big themes driving effective risk management and control. The first is the importance of first-mover advantage, where pricing, financial resource, and crisis management all demand that data is seen as an essential asset.
Determining whether to take down or lay off risk requires a firm to accurately calculate its costs of risk – for example, credit, capital, funding or collateral – in order to form an accurate opinion on price.
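To make that idea concrete, the sketch below shows – in deliberately simplified form – how an all-in quote might be assembled from a risk-free mid less the costs of risk listed above. Real XVA numbers come from portfolio-level exposure simulations; the field names and flat charges here are purely hypothetical placeholders.

```python
# Illustrative sketch only: an all-in price built from a risk-free mid minus
# the main costs of risk (credit, capital, funding, collateral). The component
# names and the flat example charges are assumptions, not a real XVA engine.
from dataclasses import dataclass


@dataclass
class RiskCosts:
    cva: float    # credit valuation adjustment (counterparty credit risk)
    kva: float    # capital valuation adjustment (cost of regulatory capital)
    fva: float    # funding valuation adjustment (cost of funding exposure)
    colva: float  # collateral/margin valuation adjustment (cost of posting margin)


def all_in_price(risk_free_mid: float, costs: RiskCosts) -> float:
    """Price at which the dealer is compensated for the risks it runs."""
    return risk_free_mid - (costs.cva + costs.kva + costs.fva + costs.colva)


# Example: a trade worth 1.000 risk-free is only worth 0.955 all-in once the
# costs of risk are charged for; quoting better than that gives margin away.
print(all_in_price(1.000, RiskCosts(cva=0.020, kva=0.015, fva=0.007, colva=0.003)))
```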
Equally, from a cost-saving perspective, the financial resource management function will be looking to direct hedges towards the cheapest cleared versus bilateral choices, where clearing is not mandatory. It will also be looking to use complete netting sets, network effects and third-party vendors to further reduce bilateral risk and costs.
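The cleared-versus-bilateral choice is, at its core, a cost comparison across routes. The following minimal sketch illustrates the shape of that decision; the cost components and basis-point figures are hypothetical, and a real financial resource management function would model margin funding, capital and netting effects far more carefully.

```python
# Minimal sketch, not a production model: comparing the all-in cost of routing
# a hedge to a CCP versus keeping it bilateral where clearing is not mandatory.
# All component names and numbers below are illustrative assumptions.
def cheapest_venue(cleared_costs: dict, bilateral_costs: dict) -> str:
    """Return the cheaper execution route by summing each route's cost components."""
    cleared_total = sum(cleared_costs.values())
    bilateral_total = sum(bilateral_costs.values())
    return "cleared" if cleared_total <= bilateral_total else "bilateral"


# Example inputs, in basis points of notional (purely illustrative):
cleared = {"initial_margin_funding": 2.5, "clearing_fees": 0.4, "default_fund": 0.3}
bilateral = {"cva": 1.8, "kva": 1.2, "bilateral_im_funding": 1.5}
print(cheapest_venue(cleared, bilateral))  # -> "cleared" under these assumptions
```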
In times of crisis, there is no substitute for timely information – for example, knowing your market and legal risk positions allows firms to react swiftly, and with confidence.
All these elements are complicated by the large number of inputs required to work them out. Regulations have emphasised the need for accuracy, completeness and timeliness of those inputs, but the second theme, consistency, depends on the degree to which departments and functions collaborate firm-wide.
Allowing several versions of the truth to circulate simultaneously within different functions not only forces teams to spend their valuable time internally debating facts rather than issues; in practical terms, it will eventually result in a costly bill for duplicated effort, reconciliation and missed trading opportunities.
First-mover advantage slows down dramatically if a firm can’t be sure of its own numbers when decision-makers from different units come together. The consequences may not be felt in calmer times, but when markets get stressed, the pain can be acute.
Questions around obtaining and integrating crucial data should not limit management responses, or indeed imagination. But if consistency is a concern, it calls into question the assumptions underpinning the scalability of banks’ systems and processes – their ability to find, and then push, the right data to the right decision-making points in the organisation.
Silo (busting)
Historically, banks were typically organised around individual product lines, with cross-business line collaboration not explicitly incentivised. Indeed, over the last couple of decades, most banks have found themselves persisting in this bottom-up approach, tackling the most immediate fires according to the needs of each specific function or individual regulatory directive.
While this siloed approach allows units to tick off their own compliance obligations, aligned to their department’s policies and procedures, it does little for the firm-wide view. Too often individual teams produce inconsistent results when mapping their own data sources together, even where the underlying data itself remains identical between departments.
With bank-wide strategies subordinated to the will of individual functions, legacy infrastructures tend to be complex and inflexible, with hard-baked dependencies spinning up expensive cottage industries to reconcile data across teams – cottage industries that have themselves become obstacles to change.
Worse still, without individual units seeing a performance issue in their own backyard, and therefore nothing to ‘fix’, the business case for prioritising infrastructure change has become utterly incoherent at most financial institutions. The altruistic will to effect lasting change for the greater good rarely earns a promotion.
Inconsistencies resulting from siloed behaviour have ended up leaking into firms’ day-to-day performance and are clearly a source of frustration for both senior managers and regulators.
Persistent operational risk incidents are painful, even more so where they invite explicit regulatory scrutiny and remediation; whereas the effect of other issues, such as new risks in the market that are missed and therefore not adequately priced, will be felt over time.
Some practitioners have observed that contract risk – the risk that crucial terms embedded in increasingly complicated collateral documentation are not properly reflected in derivatives pricing – is likely to be the next battleground beyond those considered in existing valuation adjustments.
The realpolitik is that if firms don’t pivot into a focused data-centric strategy, both culturally and practically, they are likely to find themselves sliding towards the back of the pack. Given the pressure on banks’ trading and markets divisions just now, it’s time to sharpen up the pens on identifying gaps in first-mover advantage.
Nick Sainsbury was a managing director in Credit Suisse’s counterparty portfolio management team. This piece has been authored by him in his role as co-founder of consulting and software provider Aptonomy (www.aptonomy.io). Co-founders Philip Staddon and Alex Beddoes also contributed to the editing.