There are more questions than answers when it comes to FRTB. But banks cannot wait for clarification – they must stop running parallel processes and press ahead with their preparations, says Scott Sobolewski, principal consultant at Quaternion Risk Management.
Since the finalised rules were published in January 2016, the Fundamental Review of the Trading Book (FRTB) has raised numerous questions requiring further clarification from regulators. These include: strong versus weak forms of risk-theoretical profit and loss (P&L); whether valuation adjustments such as independent price verification or the prudent valuation adjustment fall within the scope of the P&L attribution test; timing differences between a single global close and local P&Ls; the location of the regulatory capital floor; and the potential loss of adjoint algorithmic differentiation (AAD) in calculating risk sensitivities – the list goes on.
The Basel Committee on Banking Supervision’s Trading Book Group will continue to clarify many of these outstanding questions through industry surveys, meetings, conference calls and FAQ publications until the January 2019 implementation date. But, amid the industry’s confusion over FRTB’s finer points, banks must continue to forge ahead with preparations on both qualitative and quantitative fronts: structural reviews of individual trading desks; cost/benefit analyses of pursuing the internal model approach (IMA); and decisions on whether to upgrade existing IT infrastructure or start afresh with completely new systems.
To receive approval to use the IMA in advance of the implementation deadline, banks must show regulators at least one year of documented model performance data. They must therefore start collecting model data by the beginning of 2018, which in turn necessitates development, validation and management sign-off by the end of 2017. Even the standardised approach (SA) – considered the fallback option for banks or individual desks unable to pursue the IMA – substantially increases the calculation burden relative to previous Basel III iterations, in the form of a sensitivities-based approach, a default risk charge and a residual risk add-on. The SA will likely require IT investment from banks currently unable to calculate delta, vega and curvature risk factor sensitivities.
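The mechanics behind those SA inputs can be sketched in a few lines. The snippet below is a toy illustration only – it uses a textbook Black-Scholes call pricer as a stand-in for a bank's actual pricing library, and simple finite differences ('bump-and-revalue') to produce first- and second-order spot sensitivities of the kind the sensitivities-based approach consumes. Bump sizes and the pricer itself are illustrative assumptions, not anything prescribed by the FRTB text.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function (stdlib only)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(spot, strike, rate, vol, tau):
    """Black-Scholes call price -- a toy stand-in for a pricing library."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol ** 2) * tau) / (vol * math.sqrt(tau))
    d2 = d1 - vol * math.sqrt(tau)
    return spot * norm_cdf(d1) - strike * math.exp(-rate * tau) * norm_cdf(d2)

def bump_and_revalue(price_fn, base, bump):
    """Sequential finite-difference sensitivities: three revaluations
    per risk factor, versus one AAD sweep for all factors at once."""
    up = price_fn(base + bump)
    down = price_fn(base - bump)
    mid = price_fn(base)
    delta = (up - down) / (2 * bump)       # first-order (delta-style)
    curvature = up - 2 * mid + down        # second-order (curvature-style)
    return delta, curvature

# Spot sensitivities of a single at-the-money call (illustrative parameters)
price_in_spot = lambda s: bs_call(s, 100.0, 0.01, 0.20, 1.0)
delta, curv = bump_and_revalue(price_in_spot, 100.0, 0.01)
```

The point of the sketch is the cost profile: each bumped risk factor costs extra full revaluations, which is why portfolios with many risk factors face thousands of pricings per trade under sequential bumping.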
An integrated approach
At its core, FRTB regulation is an attempt to unify the front office and risk management by ensuring that all risks driving reported P&L – including those considered non-modellable – are accounted for in the processes that measure capital adequacy and risk reporting. Regulators have stressed the importance of an independent risk management function since the implementation of the Dodd-Frank Act in the wake of the 2008 crisis and, in response, banks hired staff by the thousands to comply with stress-testing exercises such as the US Comprehensive Capital Analysis and Review and the European Banking Authority’s EU-wide stress testing. Through most of the past decade, these initiatives were run almost exclusively within risk and finance functions. Given that they were often the gating factor for capital distributions, banks invested heavily in parallel risk architecture that eased the regulatory reporting burden. Along the way, a clear divide emerged between front-office pricing models and the risk management models developed for year-round regulatory and internal risk reporting.
While models in both camps have passed stringent internal validation standards, and perhaps even received explicit regulatory approval, each side’s model inputs and outputs must now reconcile to an unprecedented degree in the form of hypothetical and risk-theoretical P&L. Some banks may be able to align improvements to front-office and risk systems well enough to pass the test, but the regulatory intent is that banks stop running two parallel processes. Choosing between two existing systems requires an independent, unbiased assessment of the architecture on both sides to determine where improvements can be made and efficiencies gained. For example, some banks are exploring cloud-based architecture to scale computing power more efficiently: speed will be at a premium if regulators disallow AAD, since risk sensitivities may then require thousands of calculations per trade under the sequential ‘bump-and-revalue’ approach, rather than the simultaneous calculations afforded by AAD. Alternatively, some banks lacking internal quant resources have subscribed to third-party platforms that significantly reduce internal or supplemental development costs.
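To make the reconciliation requirement concrete, the sketch below computes the two P&L attribution ratios described in the January 2016 FRTB text: the mean of unexplained daily P&L (hypothetical minus risk-theoretical) over the standard deviation of hypothetical P&L, and the ratio of their variances. The thresholds used here – mean ratio within ±10%, variance ratio below 20% – follow that text, but this is an illustrative reading, not a compliance implementation, and it ignores the monthly breach-counting mechanics.

```python
import statistics

def pla_test(hypothetical, risk_theoretical):
    """Illustrative P&L attribution metrics per the January 2016 FRTB text.
    hypothetical: daily P&L from front-office repricing of the static portfolio.
    risk_theoretical: daily P&L reproduced by the risk model's risk factors.
    Thresholds (mean ratio within +/-10%, variance ratio below 20%) are
    taken from the published standard; breach counting is omitted."""
    unexplained = [h - r for h, r in zip(hypothetical, risk_theoretical)]
    mean_ratio = statistics.mean(unexplained) / statistics.pstdev(hypothetical)
    var_ratio = (statistics.pvariance(unexplained)
                 / statistics.pvariance(hypothetical))
    passes = abs(mean_ratio) <= 0.10 and var_ratio <= 0.20
    return mean_ratio, var_ratio, passes
```

A risk model that tracks front-office P&L closely leaves little unexplained variance and passes; a risk model missing whole risk factors leaves large unexplained P&L and fails – which is precisely why the divide between the two model families described above has become untenable.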
FRTB winners will be judged not by how effectively they meet FRTB regulation in a silo, but by how well they integrate new and improved systems capabilities with related regulatory deliverables. For instance, the Basel Committee’s new initial margin regulation for uncleared derivatives requires a calculation “consistent with a one-tailed 99% confidence interval over a 10-day horizon based on historical data that incorporates a period of significant financial stress”. This value-at-risk-style calculation, including the International Swaps and Derivatives Association’s standard initial margin model, similarly requires front-office calculation of delta, vega and curvature sensitivities for each trade. Furthermore, the calculation of margin value adjustment and the inclusion of initial margin in regulatory capital adds an entirely new dimension, as forward sensitivities will also require simulation. Although this regulation has only come into effect for the largest dealer banks (since September 1, 2016), its tiered applicability through 2020 means that many of the same banks subject to FRTB must start planning immediately for an integrated approach that minimises duplication of effort.
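The quoted margin requirement maps onto a simple historical-simulation calculation. The sketch below is an assumption-laden illustration, not ISDA's SIMM or any regulator-approved model: it takes a history of daily P&L drawn from a stressed period, reads off the one-tailed 99% tail loss, and scales to the 10-day horizon with square-root-of-time scaling (a common simplification; production models typically use overlapping 10-day returns instead).

```python
def stressed_var(daily_pnl, confidence=0.99, horizon_days=10):
    """One-tailed historical VaR scaled to the margin horizon.
    daily_pnl: daily P&L observations including a stressed period
    (negative values are losses). Square-root-of-time scaling is a
    simplifying assumption used here for illustration only."""
    losses = sorted(daily_pnl)                    # ascending: worst days first
    tail_index = int((1.0 - confidence) * len(losses))
    one_day_var = -losses[tail_index]             # loss at the 1% tail
    return one_day_var * horizon_days ** 0.5
```

The shared machinery is the point: the same sensitivity and revaluation infrastructure built for FRTB's expected shortfall and sensitivities-based approach can feed this calculation, which is why an integrated build avoids duplicating work across the two regimes.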