Action reactions

At OpRisk USA 2010, participants discussed how to deal with increasing waves of regulation and add value in the new financial environment.


Regulatory change and its impact on operational risk management was the hottest topic at the 12th annual OpRisk USA conference, held at the Mandarin Oriental in New York in March. The keynote address was delivered by Mark O’Dell, deputy comptroller for operational risk at the Office of the Comptroller of the Currency (OCC), standing in for Kevin Bailey, deputy comptroller, regulatory policy at the OCC and chairman of the Basel Committee’s Standards Implementation Group operational risk subgroup (Sigor). O’Dell gave a thorough overview of the work the Basel Committee had completed since the crisis and what it is currently working on. His central message was that, although Sigor is working on several key areas to help improve operational risk management under Basel II, any developments would not affect the flexibility of the core framework. Instead, they would seek to facilitate the continuing evolution of the operational risk discipline, while maintaining Sigor’s goal of harmonising supervisory expectations of the advanced measurement approach (AMA) to op risk management.

O’Dell focused on four main areas being explored by the working group: the use of insurance as a mitigant (see news story on page 12 for more on this); governance – how operational risk management is being integrated into the day-to-day processes of the bank; data – clarifying the critical reporting protocols in the AMA as well as looking at gross versus net loss, group loss and legal loss; and modelling – assessing the granularity and distribution of the four elements within op risk frameworks. The work being done for the sixth Quantitative Impact Study (QIS 6) relating to operational risk was also outlined.

QIS 6 is being viewed as a mini loss-data collection exercise (LDCE) for operational risk data, according to O’Dell. Banks are being asked to supply several pieces of information for op risk measurement in a manner identical to the 2008 LDCE. Data collected during that exercise only covered losses up to 2006 – the QIS data will cover losses from the crisis period. These results will help inform Sigor on the changes required to the Basel II Accord.

The lively panellists on the first regulatory panel of the day, who also serve on Sigor, responded to audience questions prompted by O’Dell’s presentation and discussed how the changes to Basel II will affect op risk. Ronald Stroz, assistant vice-president and head of the operational risk group at the Federal Reserve Bank of New York, suggested the events of September 2008 demonstrated that chief executives hardly ever consulted their op risk managers, and that the op risk community needs to take a hard look at its function to work out how it can win that elusive seat at the top table and ensure its warnings are heard and, ideally, heeded.

“Infrastructure weaknesses, particularly in IT, were highlighted during the crisis,” said Stroz. “If firms can’t aggregate liquidity risk, market risk and credit risk data in hours to understand where their weaknesses are, which was demonstrated during the stress-testing exercise, they cannot have a robust op risk management system in place.” Internal audit should also take some of the blame, according to Stroz, as they didn’t take a hard enough look at governance structure and how risk appetites were being set.

The fact that operational risk management has been left out of the most recent enhancements to the Basel II Accord was not lost on the panel. Adrienne Haden, assistant director, banking supervision and regulation at the Board of Governors of the Federal Reserve System, agreed that losses categorised as market or credit risk events had operational risk aspects that were not captured, and that the impact of operational risk in current risk frameworks has been underestimated. “Boundaries between risk types are useful, but they can also hide a high level of interaction between the risk types,” she said.

Alfred Sievold, senior examination specialist, large banking group, at the Federal Deposit Insurance Corporation (FDIC), stressed that although he felt the risk disciplines need to remain separate, operational risk managers need to push for softer boundaries to arrive at a more holistic and accurate risk profile.

AMA still valuable?
One of the more controversial questions asked whether panellists thought the advanced measurement approach to operational risk management was still valuable. There was general agreement that it was, but that the crisis had highlighted firms’ over-reliance on models without expert judgement. The Fed’s Haden said the collection and modelling of op risk data was an invaluable exercise for banks’ risk management, but that firms needed to put this data to better use by extending the data series to understand the root causes of losses. Some firms, she said, were using this to map insurance to their op risk losses, which was a pleasing development.

Taking a broker-dealer perspective, Grace Vogel, executive vice-president, member regulation at the Financial Industry Regulatory Authority (Finra), said that even though new products have become more complex and more numerous, firms haven’t updated their technology systems, with many still relying on spreadsheets. “We are reminding firms they need to spend tech dollars to build processing systems that flow to the back office to be able to prepare financial statements more accurately,” said Vogel. She also pointed out that reconciliations were still not being done on a timely basis and that, because of the availability of naked access for customers and broker-dealers, fat-finger errors were on the rise. Finra is working with firms to put filters in place to deal with the latter and, since the Société Générale rogue trader event, it has also been conducting a review of broker-dealer controls.

After lunch, guest speaker David Leinweber, fellow in finance at the Haas School of Business at UC Berkeley, founding director of the Center for Innovative Financial Technology at Berkeley and author of Nerds on Wall Street, provided a lighter look at the ongoing transformation of markets by technology in an illustrated history of wired markets. He traced the developments that revolutionised the financial markets, from the telegraph, which he dubbed the “Victorian internet”, and the invention of Morse code, to ticker tape machines and eventually computers. Moving on to the latest financial crisis, Leinweber noted that quants were noticeably absent from Time magazine’s list of the top 25 people to blame for the crisis, but argued a great deal of blame could be placed on Wall Street ‘nerds’, or quants, for the lack of high-quality quantitative data and for complex models of complex products.

Later in the afternoon, Donald Rosenthal, senior vice-president, enterprise risk management at State Street, gave a presentation on how the role of quantitative methods in AMA modelling is in flux. One of the main problems identified by Rosenthal is that there is no generally accepted methodology, as industry convergence has yet to occur. Additional problems include truncated data; extrapolation to the tails of distributions based on sparse data; underlying statistical distributions that are not standard and require customised software or code; and the unknown representativeness of data, particularly external data. The specific implications of behavioural economics are unclear, and the many different ways of bringing scenario-based qualitative judgements into a model exacerbate the problem further.
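
To make the truncation point concrete, the sketch below fits a left-truncated lognormal severity distribution to losses recorded above a reporting threshold. It is a minimal, hypothetical illustration rather than Rosenthal’s or any firm’s actual methodology; the threshold, the simulated losses and the parameter values are all invented.

```python
# A minimal sketch, not any firm's production model: maximum likelihood fit of a
# left-truncated lognormal severity to losses observed above a reporting threshold.
import numpy as np
from scipy import stats, optimize

THRESHOLD = 10_000.0                     # hypothetical loss reporting threshold
rng = np.random.default_rng(42)

# Simulate "true" losses and keep only those above the threshold,
# mimicking the truncated data sets op risk modellers typically face.
true_mu, true_sigma = 9.0, 2.0
losses = rng.lognormal(true_mu, true_sigma, 5_000)
observed = losses[losses >= THRESHOLD]

def neg_loglik(params, data, h):
    """Negative log-likelihood of a lognormal conditioned on losses exceeding h."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    dist = stats.lognorm(s=sigma, scale=np.exp(mu))
    return -(dist.logpdf(data).sum() - len(data) * np.log(dist.sf(h)))

res = optimize.minimize(neg_loglik, x0=[np.log(observed.mean()), 1.0],
                        args=(observed, THRESHOLD), method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(f"fitted mu={mu_hat:.2f}, sigma={sigma_hat:.2f}")
# A naive fit that ignored the truncation would be biased, because the small
# losses below the reporting threshold are missing from the sample.
```

Conditioning the likelihood on losses exceeding the threshold, rather than fitting the observed data directly, is what corrects for the losses that fall below the threshold and are never reported.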

Overconfidence bias
Day two kicked off with an illuminating presentation from Sanjay Sharma, chief risk officer of global arbitrage and trading at RBC Capital Markets, who drew some fascinating comparisons with how operational risk events are managed in other industries, including the Three Mile Island near-miss in the nuclear power industry and the Challenger and Columbia missions at Nasa. Sharma examined how human behaviour influences risk management decisions. One Nasa example he gave demonstrated perfectly the overconfidence bias in risk management. The Space Shuttle Columbia disaster in February 2003 was caused by damage sustained during launch, when a piece of foam insulation broke off and struck the left wing. This damaged the thermal protection layer, which shields the shuttle from the heat generated on re-entry, and as a result the craft disintegrated during its return to earth. Although Nasa mission engineers put the expected rate of total shuttle loss at 1 in 100 launches, senior management put it at 1 in 100,000, because the foam strikes recorded on the majority of previous launches had had no effect on re-entry. Overconfidence in financial models was one of the key causes of the financial crisis, and one that should be carefully monitored by risk managers. “You see this in traders all the time,” said Sharma. “When they lose millions of dollars one day they are convinced they can make it back the next.” (Sanjay Sharma will be writing a series of articles based on his presentation at OpRisk USA in forthcoming issues of Operational Risk & Regulation.)
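
A back-of-the-envelope calculation, not taken from Sharma’s presentation, shows how far apart those two estimates really are once they are compounded over the life of a launch programme:

```python
# Illustrative only: probability of at least one total loss over a 100-launch
# programme under each of the per-launch loss rates cited above.
for label, p in [("engineers (1 in 100)", 1 / 100),
                 ("management (1 in 100,000)", 1 / 100_000)]:
    prob = 1 - (1 - p) ** 100
    print(f"{label}: P(at least one loss in 100 launches) = {prob:.2%}")
# engineers: roughly 63%; management: roughly 0.1%
```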

Mark O’Dell again took to the stage, this time with Patrick de Fontnouvelle, vice-president in the supervision and regulation department at the Federal Reserve Bank of Boston, to present conclusions from the 2008 LDCE. The main finding discussed was that the op risk capital numbers from AMA firms in Europe, Japan and the US were very close to each other, even though the models were all very different. The numbers were also just below the range of the betas used for the standardised approach (TSA) to op risk under Basel II. This shows either that all of the models are well calibrated or that AMA firms are using the TSA betas as a guide or benchmark for their AMA capital number. Another expectation that wasn’t borne out was that capital numbers would vary widely between AMA models that were scenario based and those that relied mostly on loss data. In fact, the two provided roughly consistent results. These results, among others identified in the Basel Committee’s Range of Practice paper, were also discussed, and it was announced that Sigor would be looking at a number of modelling issues, including correlation, frequency distributions, granularity and the use of the four elements of op risk: internal and external loss data, scenario analysis, and business environment and internal control factors.
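
For reference, the TSA benchmark mentioned above is a simple formula: fixed beta factors of 12–18% are applied to each business line’s gross income and averaged over three years. The sketch below outlines that calculation using the Basel II betas; the business-line split and gross income figures are invented for illustration.

```python
# Basel II standardised approach (TSA) capital charge, shown only to illustrate
# the beta "benchmark" the AMA capital numbers were compared against.
BETAS = {
    "corporate finance": 0.18, "trading and sales": 0.18,
    "retail banking": 0.12, "commercial banking": 0.15,
    "payment and settlement": 0.18, "agency services": 0.15,
    "asset management": 0.12, "retail brokerage": 0.12,
}

def tsa_capital(gross_income_by_year):
    """gross_income_by_year: three dicts mapping business line -> annual gross
    income. Negative beta-weighted totals are floored at zero in each year."""
    yearly = []
    for year in gross_income_by_year:
        total = sum(BETAS[line] * gi for line, gi in year.items())
        yearly.append(max(total, 0.0))
    return sum(yearly) / len(gross_income_by_year)

# Hypothetical bank with three identical years of gross income ($ million)
years = [{"retail banking": 800, "trading and sales": 400, "asset management": 150}] * 3
print(f"TSA op risk capital: ${tsa_capital(years):.0f}m")   # $186m
```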

A panel moderated by Marcelo Cruz tackled the tough subject of getting the balance right between quantitative and qualitative op risk modelling techniques, while de Fontnouvelle and Andrew Sheen, operational risk policy team manager at the UK Financial Services Authority (FSA), discussed whether the TSA betas need to be reformed – a question also under consideration by a Sigor working group.

After lunch, Mark Abkowitz, professor of civil engineering and director of the Vanderbilt Center for Environmental Management Studies at Vanderbilt University, and author of Operational Risk Management: A Case Study Approach to Effective Planning and Response, sought to convince the audience of hardened op risk professionals that they need to take a climate risk physical. He urged them to consider whether their current operational risk management programme accounts for climate change risk, to assess which climate change scenarios are reasonably foreseeable, and to assess the likelihood and consequences of each scenario. His message seemed to resonate: on the final panel of the conference, which discussed op risk challenges for 2010 and beyond, panellists noted that climate change risk was something they would ensure was present in their business continuity plans.
