Journal of Operational Risk

Welcome to the third issue of Volume 11 of The Journal of Operational Risk. With the entire operational risk community focused on the controversial Basel Committee consultative paper that introduced the standardized measurement approach (SMA) and retired the advanced measurement approach (AMA), I asked the members of our editorial board to write technical papers analyzing the impact of the new rules. The journal was fortunate to have three board members respond to the challenge, devoting many hours of their valuable time to this analysis and writing very interesting pieces on the consequences of the rules for bank capital. I imagine that most of us reading these papers are deeply involved in discussions on the impact and consequences of these changes and will therefore appreciate more high-quality views on the topic.

At OpRisk North America and OpRisk Europe, which took place a few months ago, the SMA clearly dominated discussion, with very animated exchanges between practitioners and regulators. As readers will see in the two papers below that analyze its consequences, the new approach is risk insensitive: that is, there is no connection between a bank's risk management actions and the operational risk capital calculated under the SMA. The SMA will not become the official rule until the final paper is published by the Basel Committee and the new standards are confirmed and promulgated by the participating countries, so there is still time for debate, and we ask readers and authors to submit their analyses to The Journal of Operational Risk; even less technical papers will be reviewed and, if accepted, published in our Forum section. As the leading publication in the area, The Journal of Operational Risk would like to be at the forefront of this debate, and we welcome papers that shed light on it.

In this issue we present four technical papers. Two analyze the SMA, one examines operational loss data, and the fourth tackles statistical issues around the quantification of operational risk.

In our first paper, "Should the advanced measurement approach be replaced with the standardized measurement approach for operational risk?", two of our board members, Pavel Shevchenko and Ariane Chapelle, along with their regular collaborator Gareth Peters and also Bertrand Hassani, discuss and analyze the weaknesses and pitfalls of the SMA, such as instability, risk insensitivity, super-additivity and the implicit relationship between the SMA capital model and systemic risk in the banking sector. They also discuss problems with a precursor model proposed by the Basel Committee, on which the SMA was based. The authors advocate maintaining the AMA internal model framework and, as an alternative, suggest a number of standardization recommendations that could be considered to unify the internal modeling of operational risk.

In the issue's second paper, "Comments on the Basel Committee on Banking Supervision proposal for a new standardized approach for operational risk", regular contributors Giulio Mignola and Roberto Ugoccioni, together with our board member Eric Cope, study the behavior of the SMA under a variety of hypothetical and realistic conditions, showing that the simplicity of the new approach comes at a considerable cost. Their study shows that the SMA does not respond appropriately to changes in a bank's risk profile (ie, it is risk insensitive, as the first paper also found) and is incapable of differentiating among the range of possible risk profiles across banks; that SMA capital results generally appear to be more variable across banks than AMA results, under which banks had the option of fitting loss data to statistical distributions; and that the SMA can lead banks to overinsure or underinsure against operational risks relative to previous AMA standards. Finally, the authors argue that the SMA is not only retrograde in its capability to measure risk but also, perhaps more importantly, fails to create any link between management actions and the capital requirement.
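The risk insensitivity that both papers highlight can be made concrete with a toy sketch. The example below is a stylized, purely volume-based capital charge in the spirit of the consultative paper's business indicator component; the bucket boundaries and coefficients here are hypothetical, not the Basel Committee's actual calibration. The point is structural: a charge computed only from a business-volume indicator is unchanged by any improvement (or deterioration) in a bank's loss experience.

```python
# Stylized, hypothetical illustration of a volume-based capital charge.
# NOT the official Basel SMA formula: bucket boundaries (in billions)
# and marginal coefficients below are invented for illustration only.
BUCKETS = [(1.0, 0.11), (30.0, 0.15), (float("inf"), 0.19)]

def volume_based_capital(business_indicator):
    """Piecewise-linear charge applied to the business indicator alone."""
    capital, lower = 0.0, 0.0
    for upper, coeff in BUCKETS:
        if business_indicator > lower:
            # Charge the marginal coefficient on the slice of the
            # indicator that falls inside this bucket.
            capital += (min(business_indicator, upper) - lower) * coeff
        lower = upper
    return capital

# Two banks with the same business indicator but very different loss
# histories and control environments receive identical capital: loss
# experience never enters the calculation, so management actions to
# reduce operational risk have no effect on the charge.
bank_with_strong_controls = volume_based_capital(5.0)
bank_with_weak_controls = volume_based_capital(5.0)
assert bank_with_strong_controls == bank_with_weak_controls
```

The actual SMA proposal does include a loss component via a multiplier, but, as the two papers argue, the resulting charge remains dominated by the volume-based term, so the qualitative conclusion of this sketch carries over.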

In the third paper in the issue, "An assessment of operational loss data and its implications for risk capital modeling", Ruben D. Cohen employs a mathematical method based on a special dimensional transformation to assess operational loss data from an innovative perspective. The procedure, formally known as the Buckingham Pi theorem, is widely used in experimental engineering to extrapolate the results of tests conducted on models to prototypes. Applied to the operational loss data considered here, the approach reveals a common and seemingly universal trend underlying all the resulting distributions, regardless of how the data set is divided (ie, by event type, business line, revenue band, etc). This dominant trend, which also appears to have a tail parameter of 1, could have profound implications for how operational risk capital is computed.

In our fourth and final paper, "Rapidly bounding the exceedance probabilities of high aggregate losses", Isabella Gollini and Jonathan Rougier take on the task of assessing the right-hand tail of an insurer's loss distribution over some specified period, such as a year. They present and analyze six approaches (four upper bounds and two approximations) and examine them under a variety of conditions, using a large event loss table for US hurricanes. They also discuss the appropriate size of Monte Carlo simulations and the imposition of a cap on single-event losses. Based on their findings, they strongly advocate the gamma distribution as a flexible model for single-event losses because of its tractable form in all of the methods.

Marcelo Cruz
