Seizing the opportunity of transformational change


THE PANEL

  • Frank Heanue, head of presales for ERM, Murex
  • Andy McClelland, director of quantitative research, Numerix
  • Nick Haining, chief operating officer, CompatibL
  • Etienne Varloot, head of global markets regulatory strategy and quant research, Natixis
  • Hany Farag, senior director at a large North American bank
  • Richard O’Connell, global markets lead for risk, capital and regulatory change, Credit Suisse

With a final implementation date of end-2019 on the horizon, banks are looking to respond swiftly and effectively to the challenges posed by the new Fundamental Review of the Trading Book (FRTB) framework. In this Q&A, sponsored by CompatibL, Murex and Numerix, a panel of market risk practitioners explores the revolution in data, the advances in technology, and the rethinking of governance, trading structures and hedging strategies that the new rules demand.

What are the greatest challenges being faced by banks on the path to implementation?

Frank Heanue, Murex: In many cases, banks’ legacy systems are either not up to the task of supporting the required calculations – for example, they cannot produce risk-theoretical profit-and-loss (RTPL) attribution or lack sufficient risk factor depth – or they cannot provide the data in a timely manner. Such systems often struggle to attain the required accuracy, at least for a subset of instruments, and to consistently align risk and trading results. In some instances, an overhaul of the current risk systems is sufficient to bring them closer to the front office. In many others, a new front-to-risk architecture is preferred, resulting in a broader project and potentially shrinking the actual FRTB compliance window. Importantly, it is often difficult for a bank to make such decisions without an initial investigation involving considerable resourcing and investment. It is no small task to understand the impact of non-modellable risk factors (NMRFs) on the overall capital requirement, or to assess the optimal granularity of risk factors needed to pass the P&L attribution test.

Even for banks using only the standardised approach (SA), older methods required inputs based on balance-sheet items, such as mark-to-market and notional, whereas FRTB-SA dictates a whole new set of additional reference data, risk and pricing capabilities. In terms of calculations, sensitivities need to be generated and kept consistent across multiple desks, and vega is now needed for product types other than options. Stress-test scenarios need to be defined and managed for curvature risk, and the SA default risk charge requires offsetting of weighted positions that can prove tricky to implement. The target solution must therefore offer the required calculation and aggregation capabilities.
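To give a sense of the aggregation machinery involved, here is a minimal sketch of the SA delta aggregation hierarchy in Python. It is illustrative only: it covers just the delta leg (no vega, curvature or residual add-ons), uses placeholder correlations rather than the regulatory parameters, and omits the prescribed high/medium/low correlation scenarios.

```python
import numpy as np

def bucket_charge(ws, rho):
    """Within-bucket charge from weighted sensitivities WS_k = RW_k * s_k,
    aggregated with the intra-bucket correlation matrix rho."""
    ws = np.asarray(ws, dtype=float)
    return np.sqrt(max(0.0, ws @ rho @ ws))

def delta_charge(bucket_ws, rhos, gamma):
    """Across-bucket aggregation of bucket charges K_b and net positions S_b.
    gamma holds inter-bucket correlations and is zero on its diagonal,
    since it applies only across different buckets."""
    K = np.array([bucket_charge(ws, rho) for ws, rho in zip(bucket_ws, rhos)])
    S = np.array([np.sum(ws) for ws in bucket_ws])
    return float(np.sqrt(max(0.0, np.sum(K**2) + S @ gamma @ S)))

# Two nearly offsetting positions in one bucket: a high correlation leaves
# only a small residual charge, which illustrates why consistent
# sensitivities across desks matter so much.
rho = np.array([[1.0, 0.999], [0.999, 1.0]])
print(delta_charge([[100.0, -98.0]], [rho], np.zeros((1, 1))))
```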

Furthermore, structural considerations such as potential desk reassignment or alignment of models and/or market data can only be addressed once conclusions are drawn from this analysis phase. These outcomes feed operational decisions and human resource elements of the project that can be time-consuming to address.

Banks must overcome a multitude of hurdles under FRTB: in data sourcing and management, in assessing current systems, in considering internal organisational and business challenges, and in terms of where to build and where to use existing vendor solutions. Many banks have, to date, been focused on other aspects of regulation, but now is the time to seriously examine the potential impacts of FRTB.

Andy McClelland, Numerix: Banks face a host of major challenges. Many are obvious, such as the challenge of setting a clear strategy to prepare for FRTB. But I would like to highlight two less obvious challenges. One is ensuring the bank’s personnel are prepared for FRTB. The scope of the new regulatory dynamic is massive, and it requires organising a strong and committed team of leaders who can make the tough decisions and be held accountable for them – and who have the capability to make the transformation meaningful, powerful and successful. A project team must also be mobilised, including representatives from trading desks, IT, risk and finance.

The other challenge is that bank management will need to make three separate worlds co-exist. To help ensure a smooth and successful transition, it will be necessary to pull together the trading, risk and finance departments. These three functions must be closely aligned during the implementation process, as key issues and decisions will impact all three from both a workflow and technology set-up perspective.

Nick Haining, CompatibL: Beyond the wide array of technical challenges imposed by FRTB, the key business challenge is the lack of certainty and finality in key aspects of the framework, even as the current implementation schedule demands that FRTB projects get under way. The most significant provisions still to be finalised are the internal models approach (IMA) capital floor, the profit-and-loss (P&L) attribution test for well-hedged portfolios, widespread NMRF challenges in all but the most liquid markets and, of course, the delayed publication of the final FRTB-credit valuation adjustment (CVA) regulation. Until these provisions are finalised, banks cannot plan FRTB projects with confidence or determine the required technological and organisational changes.

Hany Farag: The biggest challenge is the uncertainty we have at this stage. The P&L attribution test, as we understood it from the Trading Book Group of the Basel Committee on Banking Supervision, is fairly difficult to pass. To achieve the required standard, we have to rebuild risk systems across the industry to become essentially full revaluation, have matching models in the front office and risk, and align market data and risk factors between those functions. This sounds great in principle, but is very expensive to achieve. If the test is diluted, or the definition in the glossary is not required in its strict form, the cost is suddenly an order of magnitude smaller. This uncertainty is not at all helpful. More uncertainty comes from the noises we are hearing from the US regarding regulation and the pushback that may ensue. Not having a level playing field is harmful to the global financial system, and can ultimately lead to some jurisdictions walking away from the rules if they feel adoption is a one-way street.

Another challenge arising from P&L attribution is the need to study the behaviour of different products in the test. If the test itself is not yet finalised, this remains a very challenging issue. We cannot determine if certain products, or even desks, are better off on the SA or the IMA. Furthermore, you cannot price your clients properly for long-dated trades – those that mature post-FRTB – if you cannot estimate your capital impact accurately. Nor can you decide which businesses to keep and which to exit if you cannot assess the capital cost to maintain them.

Richard O’Connell, Credit Suisse: Every bank has its own unique challenges; however, some issues seem to be universal for banks seeking to pass the RTPL alignment test.

The current bottleneck is a set of technical issues with how the tests are stated, which cause models to fail for ‘unintended’ reasons. These issues will need to be resolved with FAQs and technical changes to the rules by the Basel Committee on Banking Supervision. For example, a recent FAQ clarified that local closes can be compared with local P&L for banks that operate across time zones – there are many other issues that will hopefully be resolved in a similar fashion.

Once these technical problems are addressed, banks will be able to address three common issues that, as intended, cause models to fail RTPL:

• Data lineage: ensuring every market data input is traced back to an arm’s-length transaction or quote.

• Proxies: insufficient granularity of market indexes to match position-level P&L – for example, mapping all stocks onto the Standard & Poor’s 500.

• Model imperfection: generating accurate RTPL for large portfolios of offsetting trades – for example, a swap book – requires far greater precision than for individual trades, such as a single swap.

The issues involved in addressing data lineage should not be underestimated; however, it is a well-defined problem that is relatively simple to explain. Additionally, for many banks it is a new requirement, so there are no legacy systems to fix.

Proxies and model imperfection, on the other hand, are not so well defined. It is not clear how many indexes must be expanded into sub-indexes, or how many second- and third-order effects must be incorporated, before the RTPL test can be passed. These changes must also be made to legacy systems currently in use for day-to-day risk management and capital adequacy calculations.
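The mechanics of the test itself are compact, which makes its sensitivity to proxies and model imperfection easy to see. Below is a minimal sketch of the two ratios as drafted in the January 2016 rules text – which, as the panel notes, may still change:

```python
import numpy as np

def pla_ratios(hpl, rtpl):
    """P&L attribution ratios per the January 2016 draft.

    hpl:  daily hypothetical P&L from front-office systems
    rtpl: daily risk-theoretical P&L from the risk model
    """
    hpl, rtpl = np.asarray(hpl, dtype=float), np.asarray(rtpl, dtype=float)
    unexplained = rtpl - hpl
    mean_ratio = unexplained.mean() / hpl.std(ddof=1)
    var_ratio = unexplained.var(ddof=1) / hpl.var(ddof=1)
    return mean_ratio, var_ratio

def is_breach(mean_ratio, var_ratio):
    # Draft thresholds: mean ratio within +/-10%, variance ratio below 20%;
    # a desk breaching four or more months out of 12 loses IMA eligibility.
    return abs(mean_ratio) > 0.10 or var_ratio > 0.20
```

The variance ratio in particular is what makes well-hedged books hard: when hypothetical P&L is close to zero, its variance is tiny, so even small unexplained residuals can push the ratio through the threshold.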

 

How will banks’ tech strategy and spending need to change as they implement FRTB?

Andy McClelland: To meet FRTB’s requirements, banks will need to rethink – and probably completely overhaul – their technology strategies, and can expect to spend at least tens of millions doing so. This will likely require a change in a bank’s technology philosophy. Firms will be pushed to re-evaluate the legacy software and analytics in their arsenals and explore new, more powerful hybrid technologies and methodological approaches that are open-ended, agile and transparent. The demands will include technologies that can meet the massive increase in data integration, data storage, data validation and computational power requirements, as well as open-source ecosystems that bring the data and compute environments together.

Given this, I see banks’ technology strategies changing in three ways:

• Banks will conduct more comprehensive analyses to identify gaps in existing infrastructure. It is important to be aware of the two core elements of the technology infrastructure – computational requirements and data management requirements – and the options for each are diverse.

• New kinds of delivery model for the new architecture will be explored. Banks can use software-as-a-service, deploy on premises or run in the cloud. Solutions deployed as a hosted service in a private cloud can facilitate rapid installations, streamline updates, enable high operational efficiency and lower total cost of ownership (TCO) compared with on-premises software.

• Banks will change their perceptions around ‘build versus buy’, letting go of the belief that building all technology in-house is the only option. That time is gone; the decision now is which elements of the architecture should be built in-house and which can be better met by best-of-breed technology vendors. The regulatory climate has been moving banks away from building technology in-house to rely more on third-party providers. These vendors specialise in developing highly customisable technology, which can serve as a competitive advantage for a bank. 

Nick Haining: For banks’ IT functions, FRTB poses a unique set of challenges to the traditional way of delivering risk software. Of these new challenges, the most dramatic is the need to reconcile risk models with front-office models to such a degree of precision that the best – and sometimes only – way to achieve it is to call the front-office pricing model from the risk software. While this seems easy in theory, the practical challenge of adopting a front-office model for use within risk software is enormous. This is why banks and software vendors that can provide risk models accurately matching front-office P&L will enjoy a considerable head start as FRTB implementation projects get under way. With regard to spending, the change from firm-wide approval to desk-level approval for the IMA both lowers the bar for gaining IMA approval and dramatically reduces its cost if pursued for a specific trading desk or line of business. This has the effect of moving the spend on IMA approval from being part of the overall strategy of the bank to being part of the strategy of a business, or even a single desk.

Frank Heanue: It could be argued that FRTB has a greater impact on banks’ IT decisions than any other regulation. As with most new regulation, data availability, data quality and volume handling are a huge concern – especially where the inability to source, manage or validate such datasets can lead to penalties in the form of large increases in capital requirements. Accuracy and timeliness of calculations are also primary factors, as is the ability to drill down into results to understand and reconcile any discrepancies and misalignments. Solutions must be performant, scalable and robust, and have the flexibility and openness to adapt to change. Take performance, for example: it is not as simple as throwing hardware at the problem. The key to any solution is software optimisation designed with FRTB in mind – in particular, eliminating redundancy by ensuring calculations are performed ‘never twice’.
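The ‘never twice’ idea is essentially aggressive caching of valuations across the many runs FRTB requires. A minimal, hypothetical sketch of the pattern follows – the pricing function and its cache key are stand-ins, not any particular vendor’s implementation:

```python
from functools import lru_cache

def full_revaluation(instrument_id: str, scenario_id: str) -> float:
    """Hypothetical stand-in for an expensive front-office pricing call."""
    return (hash((instrument_id, scenario_id)) % 1000) / 1000.0

@lru_cache(maxsize=None)
def price(instrument_id: str, scenario_id: str) -> float:
    # A valuation shared by the expected shortfall, sensitivity and P&L
    # attribution runs is computed once per (instrument, scenario) pair
    # and reused thereafter.
    return full_revaluation(instrument_id, scenario_id)
```

In practice, the cache key would also need to capture market data versions and model settings, and the cache itself usually lives in a shared results store rather than in process memory.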

In addition to software solutions, banks will also look for help from hardware and infrastructure changes. For example, many banks are leveraging grid solutions – using graphics processing units and central processing units (CPUs) – and exploring how cloud and other outsourcing can provide all or part of the FRTB solution where regulatory constraints do not exist. Of course, all these IT decisions need to be made while considering overall costs, benefit to the FRTB project, timeliness of solution delivery, synergies that can be realised with other projects and the long-term strategy of the organisation.

Etienne Varloot, Natixis: One of the novelties of FRTB is the regulator’s willingness to merge the front-office and risk pricing models, which is clearly captured by the P&L attribution test. In its current wording, the test is so stringent – and the risk of failing on data grounds alone already so material – that most banks are unwilling to risk a major model gap between risk and the front office. This front-to-risk integration is new and has significant technology implications.

First, the cost of computing risk metrics is skyrocketing – generating an expected shortfall or value-at-risk (VAR) computation through an elaborate autocall, target redemption forward or Bermudan swaption pricer is prohibitively expensive. Second, risk and front-office IT departments were very independent under Basel 2.5, but the new framework is pushing them towards greater integration; a similar pattern can be observed in effective expected positive exposure or derivatives valuation adjustment computation. One should expect some governance or organisational chart changes as well as a new approach to IT budget management.  
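Part of that cost escalation comes directly from the structure of the IMA capital measure: the liquidity-horizon-adjusted expected shortfall (ES) requires several full ES runs – one per liquidity horizon bucket – each involving complete revaluation through those same elaborate pricers:

$$\mathrm{ES} = \sqrt{\mathrm{ES}_T(P)^2 + \sum_{j \geq 2}\left(\mathrm{ES}_T(P,j)\,\sqrt{\frac{LH_j - LH_{j-1}}{T}}\right)^2}$$

where $T$ is the 10-day base horizon and $\mathrm{ES}_T(P,j)$ is the expected shortfall computed with shocks applied only to risk factors whose liquidity horizon is at least $LH_j$. Multiplied across the current and stressed calibrations, the number of revaluations grows quickly.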

 

Does FRTB present an opportunity for transformational change?

Frank Heanue: Certainly. FRTB provides a unique opportunity for banks to critically analyse their infrastructures with a view to streamlining their architecture, as well as meeting the shorter-term regulatory requirements. 

Requirements around static data have grown in both the volume and type of data required, and the bank must also ensure consistency of data across systems while remaining conscious of the overlap and potential synergies with other regulatory requirements – in particular, initial margin and Markets in Financial Instruments Directive (Mifid) II data transparency.

FRTB may force organisational change as some desks become too expensive to run as a viable enterprise. Others may be merged to reduce the capital burden or to boost the chances of passing the P&L attribution tests. Additionally, new lines of business and relationships may need to be developed to address client needs and to prevent under-diversification or over-concentration of business activities. FRTB transformation also offers the opportunity to make communication channels within and across business lines more efficient, and to provide cross-functional management structures for such processes across the front office, risk, finance and IT. Similarly, it forces the bank to consider aligning different close-of-business criteria across geographically dispersed entities.

In IT infrastructure, the new regulation provides – and, in some cases, dictates – the opportunity to review and enhance front-office and risk systems. In fact, many banks are already using the opportunity of FRTB to replace legacy systems and to more closely align risk and trading. Other customers are looking at how FRTB fits into the new regulatory framework and how synergies can be realised across regulations such as the standard initial margin model (Simm), SA-counterparty credit risk (SA-CCR) and SA-credit valuation adjustment (SA-CVA), among others.

Nick Haining: We believe that FRTB presents a unique opportunity to improve the quality of the models and processes in risk management to a greater degree than previous iterations of the Basel accords required. One of the key drivers of this change will be the need to meet P&L attribution and NMRF requirements. Even if, as widely expected, the P&L attribution and NMRF criteria are relaxed or amended, the need to reconcile front-office P&L with the risk model and carefully document the origins of the market data inputs will drive systemic improvements to the quality of risk management models, processes and data.

Etienne Varloot: Considering the budget and the amount of governance and workflow change required by FRTB, it would be a missed opportunity not to leverage this mandatory change into something transformational – especially given the similar drive from parallel regulation: the US Federal Reserve Board’s Supervisory guidance on model risk management (SR11-7), the Volcker rule and Mifid II. The major impact could be in the integration of product and model governance, as models become increasingly shared between the front office and risk; in the integration of P&L explanation between risk and independent price verification; or in model validation spanning risk and pricing models.

Andy McClelland: With disruption comes opportunity. I believe a comprehensive and successful implementation of FRTB will, in several ways, result in positive transformational change for banks. One outcome is a diminution of risks to a bank’s reputation. Making the investment in the technology will help ensure financial stability and bank solvency. In addition, having the best risk management technology in place sends the message that the firm is dedicated to protecting the clients it serves. 

Another positive outcome is a gain in cost benefits and efficiencies. The use of the cloud and other technologies, for example, can enable quick deployment, enhanced speed, faster model performance and a lower overall TCO. The more rapid computation of the incremental capital impact or margin impact of trading decisions may enable firms to eventually achieve greater efficiency. With more capital at their disposal and not tied up in capital charges, and lower aggregate margin requirements, banks can put these resources to work and focus again on profitability. 

Richard O’Connell: Absolutely. If a firm invests in the system changes to bring risk calculations to a level that passes RTPL alignment requirements, it will find considerably more ways to use this information. For example, if you can attribute P&L moves at a granular level daily, you can sum up daily moves over a quarter, and discuss secular drivers of P&L with confidence and accuracy.

Hany Farag: Yes, it is an opportunity for the risk function to upgrade its skill set and up its game. It is an opportunity for the bank to align models between the front office and risk for better measurement and faster time-to-market for various products. It is an opportunity for finance, risk and the front office to align their processes, automate them and cut costs – eventually, though with a steep climb at first. It is an opportunity for capital market executives to re-examine their businesses, assess profitability, risk-reward trade-offs in the new framework and capital costs, and re-strategise for the next five to 10 years. Most banks are expected to do this, as redefining desks and their strategies at this point in light of FRTB and other regulations is a must. Return on equity (ROE) will be heavily impacted by FRTB, and one cannot miss the opportunity to use this lever to optimise ROE for the next decade.

 

What are the challenges and potential systemic risks posed by the standardised approach?

Nick Haining: The main challenge of the SA, in FRTB as in earlier iterations of the Basel framework, is the lack of precision in risk sensitivity. This drives the use of conservative calibration, in turn causing higher capital levels for firms unable or unwilling to undergo the arduous process of obtaining IMA approval. The use of the SA by most market participants may also lead to concentration in certain thinly traded hedge instruments, increasing the potential for a liquidity crunch. A different challenge, unique to the SA, is that, unlike the Simm, it lacks provisions for recalibration. The regulatory risk weights and correlations will increasingly fall out of sync with the relevant historical periods. If markets pass through periods of stress in the future, the lack of a global recalibration provision in the SA may cause some country supervisors to unilaterally impose additional capital multipliers, destroying the level playing field the Basel capital accords aim for.

 

What operational changes will result from FRTB implementation around desk structure and internal risk transfer (IRT) practices? 

Etienne Varloot: Implementing the FRTB desk structure creates many challenges because it is a multi-layered problem and a company-wide endeavour. Constraints such as ‘one trader per desk’, ‘one budget per desk’ or ‘IRT back-to-back risk reversal for non-IR with external firms’ are not current market practice, and require changing the organisation and responsibilities across the firm. In the special case of a banking mutual group, the IRT issue needs extra care to handle the group’s overall asset-liability management. The new IRT rule also seems to be pushing banks to move into their trading book some functions traditionally handled by their treasury in the banking book, such as Euro Medium-Term Note issuance.

Other considerations are compatibility with the Volcker desk structure and its key information ratios, the domestic banking law desk structure and the homologation rationale: are the risk axes homogeneous? Is this an NMRF-prone desk? How different are the front-office and risk models? What is the CPU cost of ES computation? Do we even bother adopting the IMA?

A key concern is the homologation strategy: which desks are due to be IMA-homologated, and when? Keeping a desk under the sensitivity-based approach (SBA) would burden it with extra capital, but the cost of going down the IMA road and the likelihood of failing homologation may be so high that the overall net present value (NPV) of the project is negative. This is all the more true given that the announced phase-in pushes the funding cost gains linked to the IMA into the future, whereas the investment costs are incurred now. This desk-by-desk IMA-or-SBA NPV computation is also difficult, as one must estimate the level of the SBA-based floor on IMA capital and future reductions in CPU grid costs.
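A stylised version of that desk-level NPV calculation is sketched below; every input is a hypothetical placeholder (capital figures, floor level, funding spread, approval probability, project cost):

```python
def ima_project_npv(capital_sba, capital_ima, floor, funding_spread,
                    p_approval, invest_cost, years, discount_rate):
    """Sketch of the IMA-or-SBA decision for a single desk.

    Annual saving = (SBA capital - floored IMA capital) * cost of funding
    that capital; the saving is discounted, weighted by the probability of
    passing homologation, and netted against the up-front project cost.
    """
    effective_ima = max(capital_ima, floor * capital_sba)  # capital floor
    annual_saving = (capital_sba - effective_ima) * funding_spread
    pv_savings = sum(annual_saving / (1 + discount_rate) ** t
                     for t in range(1, years + 1))
    return p_approval * pv_savings - invest_cost

# Illustrative numbers only: once the floor, failure risk and up-front cost
# bite, a desk that would substantially cut its capital under the IMA can
# still show a negative project NPV.
print(ima_project_npv(capital_sba=500e6, capital_ima=300e6, floor=0.75,
                      funding_spread=0.10, p_approval=0.3,
                      invest_cost=30e6, years=10, discount_rate=0.08))
```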

This raises a final question: if a desk stays on the SBA, will other IMA desks subsidise its capital valuation adjustment charge? The challenge is a mix of extremely technical considerations, decisions with serious end-of-year desk economic-value-added impact and organisation chart adjustments. The bank’s senior management must therefore be engaged to make the required decisions.

Andy McClelland: Viewing the desk structure question from a purely capital perspective, things boil down to which types of desk are most likely to gain and maintain IMA approval. There is an incentive to keep desks small to minimise cliff effects: if a small desk loses IMA approval due to poor P&L attribution performance, the impact on the bank as a whole might be less significant. On the other hand, dedicated desks dealing only in exotic products, which require complicated pricing models, might find the task of achieving adequate P&L attribution performance overwhelming. Indeed, middle-office risk systems have long used simpler pricing models than their front-office counterparts, and such differences will be heavily scrutinised under the new IMA approvals framework. The problem is probably even worse when one considers that the lower-order risks of these desks will likely be hedged internally.

Nick Haining: One of the novel aspects of FRTB compared with the current regulations is the stringent set of requirements used to prevent regulatory arbitrage through aggressive IRTs between the trading book and the banking book. Banks that previously relied on risk transfers and hedging across formal trading desk boundaries will have to reorganise their front offices or face a significant increase in their capital requirements.

 

The P&L attribution test and risk factor modellability criteria under FRTB are pulling banks’ risk models in conflicting directions. What are your thoughts on this?

Andy McClelland: In short, there is a trade-off between P&L attribution and NMRFs. Using more risk factors makes it easier to pass the P&L attribution test, which means a bank can obtain approval to use the IMA. However, using more risk factors also means some are likely to be non-modellable, as there will be less data to support each individual risk factor. NMRFs are penalised with a separate stress-scenario charge, which pushes up IMA capital costs. This may be very frustrating for dealers, leaving them with the difficult task of determining the optimal trade-off between the two competing pressures.
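The modellability side of that trade-off is mechanical: as drafted, a risk factor needs at least 24 ‘real’ price observations in the past year, with no gap longer than one month between consecutive observations. A minimal sketch, approximating one month as 31 days:

```python
from datetime import date, timedelta

def is_modellable(obs_dates, window_end):
    """Draft FRTB modellability test: at least 24 real-price observations
    in the trailing year, with no gap between consecutive observations
    longer than (approximately) one month."""
    start = window_end - timedelta(days=365)
    obs = sorted(d for d in obs_dates if start <= d <= window_end)
    if len(obs) < 24:
        return False
    return all((b - a).days <= 31 for a, b in zip(obs, obs[1:]))

# Fortnightly quotes pass; splitting the factor into two finer-grained
# factors would halve each child's observations to 13 and fail the count -
# which is exactly the tension with P&L attribution described above.
fortnightly = [date(2017, 1, 2) + timedelta(days=14 * k) for k in range(26)]
print(is_modellable(fortnightly, date(2017, 12, 31)))  # True
```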

 

What challenges does FRTB’s NMRF framework pose for data sourcing and management?

Hany Farag: There are three components to NMRFs, two of which – data sourcing and governance – go together and are already a big challenge. It takes a rigorous governance process to keep track of the required information on modellability, and more data must be sourced to satisfy the modellability criteria. From my perspective, however, an even greater challenge is to efficiently model these risk factors to reduce their capital charges in the spirit of the rules, which takes a fairly sophisticated infrastructure and creative modelling choices. One is compelled to do so in order to achieve high efficiency and compete in the marketplace – the infrastructure and its maintenance are non-trivial and certainly costly, but ultimately worth it.

Nick Haining: Before FRTB, data providers were able to use advanced interpolation algorithms to work around stale data and lack of liquidity in certain market segments. Previously, banks using the data did not need to document the interpolation algorithms used to obtain, for example, a continuous volatility surface from a limited set of data points. Under FRTB, for the data to be acceptable as input to certain calculations, it will have to rely on actual trades or executable quotes and the bank will need to have full transparency with respect to the data quotes and interpolation algorithms. 

 

What challenges does FRTB pose for client relationships?

Nick Haining: Under FRTB, some banks may face dramatic increases in capital requirements for the type of trading on which they have historically built their unique client relationships. For example, a regional bank may have developed valuable client relationships by providing liquidity in names traded primarily in its home market. Under FRTB, this service may attract NMRF charges that severely limit the bank’s ability to continue providing this important service to its clients.

 

What governance challenges are there for data pooling and sharing initiatives?

Hany Farag: Data pooling is very promising, and may be the wild card that makes the IMA capital-neutral in its impact – or possibly capital-reducing. However, there are many moving parts. We need regulatory clarity on the requirements for data pooling and proper governance. Some banks want to collaborate, yet others – often quite large – prefer to go it alone and share nothing, which can lead to the lack of a level playing field and other problems. It is hoped the regulators can establish clear guidelines that maintain a level playing field without being too onerous. If trade information is out there, I predict it will ultimately become more transparent, and traders will opt for capital reduction by disclosing more of it.

Nick Haining: Because FRTB requires that market data inputs be based on actual trades or executable quotes, a data pooling provider would have to receive and store more information about each trade, including its counterparty, than would normally be collected by a typical consensus-based data source. The ability to ensure proper stewardship and confidentiality of this highly sensitive data, as well as the legal issues surrounding its disclosure to the data pooling service, will have to be addressed for data pooling to attract a critical mass of contributors.

 

What are the challenges and potential systemic risks posed by the standardised approach?

Hany Farag: When you have a crude capital measure that is punitive, it is natural for traders to look for the most capital-efficient products. These are likely to carry higher risk than the products they trade today, but be more capital-efficient under FRTB. Given that the SA is a simplistic model, traders are likely to find themselves trading products whose risk-to-capital ratios are higher than we would like. In other words, the SA will be too crude for these products and may not capitalise them properly.

There are other ambiguities and anomalies in the SA. For example, it uses the maturity of instruments to allocate sensitivities along the term structure – yet a swap can mature in 10 years. Should all its sensitivities be summed up and allocated to the 10-year point? That makes no sense, and would lead to all kinds of bizarre anomalies and incorrect risk measurements. We also have asymmetries in foreign exchange risk, where the SA seems to favour US dollar-reporting banks. This appears unintentional, but we found extremely problematic examples demonstrating anomalies of up to 400% for some portfolios. This is not a level playing field at all. I hope these and other deviations will be addressed with an open mind by the regulators.
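The term-structure point can be made concrete. A hedged sketch contrasting the two allocations follows – the tenor grid matches the SA general interest rate risk vertices, but the pricer is a hypothetical callable, not any specific implementation:

```python
TENORS = [0.25, 0.5, 1, 2, 3, 5, 10, 15, 20, 30]  # SA rate vertices (years)

def naive_allocation(total_dv01, maturity):
    """Lump the instrument's entire rate delta at the vertex nearest its
    maturity - the reading Farag argues makes no sense for a swap."""
    vertex = min(TENORS, key=lambda t: abs(t - maturity))
    return {vertex: total_dv01}

def bucketed_allocation(pricer, curve, bump=1e-4):
    """Per-vertex delta: bump each curve node by 1bp and reprice, so a
    10-year swap shows sensitivity across the whole curve, not one point."""
    base = pricer(curve)
    deltas = {}
    for t in TENORS:
        bumped = dict(curve)
        bumped[t] = bumped.get(t, 0.0) + bump
        deltas[t] = pricer(bumped) - base  # PV change per 1bp bump at t
    return deltas
```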

 

What does FRTB mean for enterprise-wide capital optimisation and product choice for clients?

Etienne Varloot: In the current low-yield environment, some popular retail products rely on structured coupon pick-up. To generate extra spread, these payouts may monetise some illiquidity premium – for example, long-dated out-of-the-money sensitivity, its related unobservable basis risk and the associated Greeks. FRTB is rightly demanding more capital for that type of risk axis through the NMRF or residual risk add-on (RRAO) capital charges.

To avoid an accumulation of these risks and capital charges, it is likely the risk budget allocated to those products will be more constrained by risk limits. Another avenue may be risk mitigation offered by new solutions: advanced investors unaffected by the FRTB framework that are able to carry those tail risks on their balance sheets, and/or alternative funds.

Nick Haining: By imposing stringent trading-desk definitions and disallowing offsets across trading-desk boundaries, FRTB will force banks to reorganise their front-office hierarchies in a way that benefits enterprise capital optimisation. This will lead to trading desks being organised according to the risks they must hedge, and not the role they play in client relationships. A trading-desk structure created for this purpose may lead to a reduced ability to focus on unique client relationships. As for product choice implications, RRAO and NMRF provisions in FRTB are designed to penalise and discourage risk-taking outside the primary risk classes and outside linear instruments or vanilla options. While this will reduce the likelihood of severe losses due to trading in complex financial instruments, it will considerably limit product choice for the clients.

Richard O’Connell: For enterprise-wide capital optimisation, a typical concern is maintaining a diverse portfolio of businesses that results in a sub-additive VAR calculation for the entire bank.

With FRTB’s ability to force desks out of the IMA and on to the SBA, things can be both simpler and much harder. For a bank that is entirely on the SBA, there is much less diversification benefit available – a simple sum across desks might be a good estimate. Conversely, for a bank that has a variety of businesses that might be on either the IMA or SBA, a proper optimisation analysis must now consider many different scenarios where individual desks are either on the IMA – and thus potentially increasing diversification – or on the SBA, in which case the diversification benefit for the remaining IMA desks is likely reduced. Accounting for all the possible permutations of IMA/SBA desks will be extremely challenging.
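A brute-force sketch of the search this implies – the capital function is a hypothetical stand-in for a full portfolio calculation that would embed the diversification effects across whichever desks sit on the IMA:

```python
from itertools import product

def best_assignment(desks, capital):
    """Enumerate every IMA/SBA assignment and keep the cheapest.

    desks:   list of desk names
    capital: hypothetical callable mapping {desk: 'IMA' | 'SBA'} to total
             bank capital, including cross-desk diversification
    """
    best = None
    for choice in product(('IMA', 'SBA'), repeat=len(desks)):
        assignment = dict(zip(desks, choice))
        total = capital(assignment)
        if best is None or total < best[0]:
            best = (total, assignment)
    return best

# With n desks the space has 2**n elements - 20 desks already means more
# than a million full capital runs, which is why the analysis is so hard.
```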

 
