Welcome to the fourth issue of Volume 10 of The Journal of Operational Risk. With this issue we close the landmark tenth year of the journal and I am proud to see that the pipeline of submitted articles is strong, as is our subscriber base.
Regulators, particularly those in Europe, have recently issued a number of challenges to the existing concepts of operational risk measurement and capital. Their argument is that operational risk capital has not yet reached maturity, despite more than ten years of practice and methodology. Banks' methodologies for calculating regulatory capital have still failed to converge and, as a result, economic capital varies significantly among banks, making benchmarking exercises very difficult to perform. I would very much welcome discussions on this subject in the journal, particularly papers for the forum section. We have a very interesting paper in this issue's forum section discussing these topics, and I would like to see more of them. In this issue we have three research papers and one forum paper.
In the issue's first paper, "A comparison of alternative mixing models for external data in operational risk", Roberto Torresetti and Giacomo Le Pera use real operational risk data to study alternative models for combining internal and external loss data. They claim that the widely used technique of scaling external data through a size proxy (eg, assets under management, revenue, etc) does not seem to be a sensible method for incorporating external data into a risk class loss distribution. The authors claim that moving to more sophisticated mixing models like kernel modified estimators and Bayesian estimators represents an improvement. They also demonstrate that their method is capable of further improving the treatment of external data in instances where the tails of the internal and external data are governed by distinct power laws.
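The kernel modified and Bayesian estimators studied by the authors are beyond the scope of this editorial, but the basic idea of a Bayesian mix of internal and external data can be sketched in a few lines. The sketch below is illustrative only and is not the authors' model: it assumes lognormal severities with a known log-scale standard deviation, uses the external losses to form a conjugate normal prior for the log-severity mean, and then updates that prior with the internal losses. All sample sizes and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical loss samples: a small internal history, a larger external pool.
internal = rng.lognormal(mean=1.2, sigma=0.5, size=50)
external = rng.lognormal(mean=1.8, sigma=0.5, size=500)

sigma = 0.5  # log-severity standard deviation, assumed known for the conjugate update
log_int = np.log(internal)

# External data supplies the prior for the log-severity mean mu.
mu0 = np.log(external).mean()
tau0 = sigma / np.sqrt(len(external))

# Normal-normal conjugate update of mu with the internal data.
n = len(log_int)
post_var = 1.0 / (1.0 / tau0**2 + n / sigma**2)
post_mean = post_var * (mu0 / tau0**2 + log_int.sum() / sigma**2)
```

The posterior mean is a precision-weighted average of the external (prior) estimate and the internal sample mean, so the external data pulls the estimate without overwhelming the internal evidence.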
In our second paper, "Application of the convolution operator for scenario integration with loss data in operational risk modeling", Pavan Aroda, Aziz Guergachi and Huaxiong Huang present a methodology that makes use of a convolution operator to integrate subject-matter-generated scenarios into operational risk models. Basically, they combine a baseline historical loss distribution model and a scenario-derived loss distribution by convolving their corresponding densities. The authors base their method on techniques used in digital signal processing, where the commutative property of convolution allows one function to smooth and average the other. The authors' method arguably addresses the inherent biases in scenario analysis and produces a combined loss distribution model that takes information from the entire domain of the calibrated scenario distribution instead of just the extremes. The underlying theory is carefully discussed in the paper and a numerical example is given, just as this journal strongly recommends.
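The core mechanism described above, convolving the two densities, can be illustrated on a discretized grid. This is a toy sketch rather than the authors' calibration: the two lognormal densities and the grid spacing are hypothetical stand-ins for the baseline historical severity model and the scenario-derived one.

```python
import numpy as np

def lognorm_pdf(x, mu, sigma):
    # Lognormal density; zero for x <= 0.
    out = np.zeros_like(x)
    pos = x > 0
    out[pos] = np.exp(-(np.log(x[pos]) - mu) ** 2 / (2 * sigma**2)) / (
        x[pos] * sigma * np.sqrt(2 * np.pi)
    )
    return out

dx = 0.05
x = np.arange(0.0, 50.0, dx)

# Discretize the two densities into probability masses and normalize.
f_hist = lognorm_pdf(x, mu=1.0, sigma=0.6) * dx  # baseline historical model
f_scen = lognorm_pdf(x, mu=2.0, sigma=0.4) * dx  # scenario-derived model
f_hist /= f_hist.sum()
f_scen /= f_scen.sum()

# Convolution of the two mass functions: each density smooths the other,
# and the result draws on the whole domain of the scenario distribution.
combined = np.convolve(f_hist, f_scen)
```

Because convolution is commutative, it makes no difference which distribution is treated as the smoothing kernel, which is the digital-signal-processing property the authors exploit.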
In the third paper in the issue, "Random matrix theory applied to correlations in operational risk", François Crenin, David Cressey, Sophie Lavaud, Jiali Xu and Pierre Clauss note that measuring correlations among operational risk types has a potentially significant impact on regulatory operational risk capital figures. The authors focus on the distribution of this correlation and show that the distribution could exhibit some noise because of the structure of operational risk loss data. Consequently, the estimation of pairwise correlations and diversification benefits could lack accuracy. Noting that supervisory guidelines from the Basel Committee on Banking Supervision for the advanced measurement approach address the soundness and integrity of correlation estimates, they propose a framework based on random matrix theory to control the real levels of observed pairwise correlations, rather than focusing only on the correlations' average.
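A standard random matrix theory diagnostic, which the interested reader may find a useful entry point to the paper, compares the eigenvalues of an empirical correlation matrix against the Marchenko-Pastur upper edge for pure noise: eigenvalues below that edge are statistically indistinguishable from noise. The sketch below illustrates the diagnostic on simulated Gaussian data; it is not the authors' framework, and the dimensions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 10  # T observations per series, N operational risk cells
X = rng.standard_normal((T, N))  # pure-noise data for illustration

# Empirical correlation matrix and its eigenvalues.
C = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(C)

# Marchenko-Pastur upper edge for an N x N correlation matrix
# estimated from T independent Gaussian observations.
q = N / T
lam_max = (1.0 + np.sqrt(q)) ** 2

# Eigenvalues above the edge would indicate genuine correlation structure.
signal = eigvals[eigvals > lam_max]
```

On real loss data, the same comparison separates eigenvalues carrying correlation signal from those attributable to the short, noisy samples typical of operational risk.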
This issue's forum paper, "Modeling operational risk capital: the inconvenient truth" by Patrick McConnell, presents a very controversial discussion about the role of operational risk capital in protecting banks. He notes that, since the financial crisis of 2008, large banks have incurred more than US$200 billion of operational risk losses, mainly as a result of regulatory fines, lawsuits and demands for customer redress for various types of misconduct. A basic assumption underlying the modeling of operational risk regulatory capital (ORRC) under Basel II is that such operational risk losses can be modeled as being idiosyncratic to an individual institution, as this is the level at which banks are currently regulated. This paper challenges that assumption and shows that it is an "inconvenient truth" that banks' largest losses are not firm specific. Instead, the largest losses involve multiple banks being fined at the same time by multiple regulators for the same types of misconduct. The author calls such large multibank incidents "systemic operational risk events" and argues that, as well as looking at firm-level problems, ORRC should also be modeled at the "systemic", or macroprudential, level. The paper also discusses arguments made by academics against current approaches taken to modeling ORRC and, finally, makes a suggestion to the Basel Committee that, like the review of market risk currently being undertaken, a comprehensive and fundamental review of operational risk be undertaken.