Journal of Operational Risk

Welcome to the second issue of Volume 11 of The Journal of Operational Risk.

The much rumored and feared Basel consultative paper that is expected to all but retire the Advanced Measurement Approach (AMA) has finally been released by the Basel Committee. As outlined in the paper, the committee proposes to apply a simple formula, using operational losses as its input, to arrive at regulatory capital for operational risk. Although much of the industry is still ruminating on how to respond, the paper has been like a bucket of cold water thrown over the operational risk community.

When I first ventured into operational risk, about twenty years ago, the key challenge was to develop a methodology to assess the size of this risk in a financial institution's portfolio. Back then, I used to say that it is very difficult to manage a risk if one is unable to measure it. In my view, applying this proposed formula is not the same as measuring operational risk. Basel is proposing to abandon the freedom of internal models in exchange for a simple formula applied uniformly across the industry. In its view, this would standardize operational risk exposures across banks and reduce its cost of supervising operational risk models. The problem is that this formula is unlikely to capture the true operational risk embedded in a financial organization, making that risk very hard to manage. The consultative paper thus creates a much bigger problem for the industry and takes us back twenty years.

This new approach proposed by Basel, dubbed the Standardized Measurement Approach (SMA), also has a number of potentially significant flaws, as early studies by practitioners have shown; some examples follow.

  • The SMA does not resolve the problem of operational risk capital instability across banks. Two banks of the same size and with the same risk profile can still show SMA capital figures that differ by more than 100%. A bank with a very extreme operational loss, due to a legal settlement for example, will carry larger capital than a similarly sized bank that was not exposed to such a loss (see the stylized calculation after this list).
  • The SMA removes the sensitivity of operational risk capital to the business and control environment by dropping Key Risk Indicators from the framework. A number of banks have developed operational risk models that grant a capital benefit for better risk management and improved controls within the organization. Under the SMA this is not possible, arguably leaving no incentive to manage the risk well.
  • Banks will probably not focus on managing operational risk from day to day; instead they will work around the SMA formula and its parameters to optimize their operational risk capital.
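
To make the instability point concrete, here is a stylized sketch in Python comparing two hypothetical banks of identical size, one of which carries a single large legal settlement in its loss history. The logarithmic blend of a business indicator component (BIC) with a loss component, and the (7, 7, 5) loss weights, follow the general shape of the consultative proposal, but every number below (the EUR 110 million anchor, the BIC and both banks' loss figures) is an illustrative assumption, not the official Basel calibration.

```python
import math

def loss_component(avg_annual_loss, avg_above_10m, avg_above_100m):
    # Weighted average annual losses (all figures in EUR millions); the
    # (7, 7, 5) weights follow the shape of the consultative proposal.
    return 7 * avg_annual_loss + 7 * avg_above_10m + 5 * avg_above_100m

def sma_capital(bi_component, lc):
    # Logarithmic blend of the business indicator component and the loss
    # component; the EUR 110m anchor is an assumed, illustrative value.
    return 110 + (bi_component - 110) * math.log(math.e - 1 + lc / bi_component)

bic = 400.0  # same business indicator component for both hypothetical banks

# Bank A: moderate loss history.
lc_a = loss_component(avg_annual_loss=30, avg_above_10m=10, avg_above_100m=0)
# Bank B: same profile plus one large legal settlement in its loss window.
lc_b = loss_component(avg_annual_loss=80, avg_above_10m=60, avg_above_100m=50)

cap_a, cap_b = sma_capital(bic, lc_a), sma_capital(bic, lc_b)
print(f"Bank A: EUR {cap_a:.0f}m, Bank B: EUR {cap_b:.0f}m "
      f"({cap_b / cap_a - 1:.0%} more capital)")
```

Even with identical business indicator components, the two capital figures differ by more than 50% under these assumptions: the loss component alone drives the gap, which is exactly the cliff effect described in the first bullet point above.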

Instead of scrapping twenty years of hard-won evolution in operational risk measurement and management, Basel should focus on improving what it sees fit to improve within the current AMA framework. The Journal of Operational Risk, as the leading publication in the area, would like to be at the forefront of these discussions, and we would welcome papers that shed light on the issues involved.

In this issue we have four research papers. Three of them deal with heavy-tailed loss data and its impact on capital estimation, while one assesses the impact of the revised standardized approach on Indian banks.

In the issue's first paper, "Operational loss with correlated frequency and severity: an analytical approach", Daniel H. Stahl addresses a common issue in operational risk: severity distributions are often very heavy tailed, which makes Monte Carlo simulation harder to perform. The paper proposes a significant generalization of the loss distribution approach (LDA) by treating operational risk as a Lévy jump-diffusion process, which enables autocorrelation in the frequency distribution and also allows for correlation between the severity and frequency distributions. Using a Runge-Kutta method, only a small number of steps is needed to achieve accuracy throughout the loss distribution: the author reports that in computational tests as few as thirty-two steps give excellent accuracy. Following The Journal of Operational Risk's editorial policy of requiring authors to provide a numerical application of their theories, the method is tested on three separate severity distributions. The impact of the resulting correlations on operational risk capital can be significant: in one computational experiment, the required capital at the 99.9% level is over 55% larger than in a zero-correlation model.
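
Stahl's contribution is analytical, so the sketch below is not his method; it is a minimal Monte Carlo stand-in, with all parameter values assumed, in which a Gaussian copula links the annual Poisson loss count to a common lognormal severity shift. It illustrates the qualitative effect the paper quantifies: positive frequency-severity dependence inflates the 99.9% quantile of the aggregate annual loss.

```python
import numpy as np
from scipy.stats import norm, poisson

rng = np.random.default_rng(42)

def quantile_999(rho, n_years=100_000, lam=25.0, mu=10.0, sigma=2.0):
    # Correlated standard normals drive the loss count and a common
    # severity shift for each simulated year.
    z1 = rng.standard_normal(n_years)
    z2 = rho * z1 + np.sqrt(1.0 - rho**2) * rng.standard_normal(n_years)

    # Frequency margin: Poisson counts via the inverse-CDF (copula) trick.
    counts = poisson.ppf(norm.cdf(z1), lam).astype(int)

    totals = np.empty(n_years)
    for i in range(n_years):
        # Severity: lognormal with its log-mean shifted by the common
        # factor, so high-count years also tend to have larger losses.
        sev = np.exp(mu + 0.5 * z2[i] + sigma * rng.standard_normal(counts[i]))
        totals[i] = sev.sum()
    return np.quantile(totals, 0.999)

q0, q7 = quantile_999(rho=0.0), quantile_999(rho=0.7)
print(f"99.9% quantile: rho=0 -> {q0:.3e}; rho=0.7 -> {q7:.3e} "
      f"(+{q7 / q0 - 1:.0%})")
```

Under these assumed parameters the correlated model's 99.9% quantile sits materially above the zero-correlation one, in the same direction as the paper's 55% finding, though the magnitude here depends entirely on the illustrative calibration.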

In our second paper, "Operational risk: impact assessment of the revised standardized approach on Indian banks", Pankaj Sinha and Sakshi Sharma provide an interesting study of the impact of the Basel standardized approach on Indian banks. With a new revised standardized approach proposed for the operational risk capital charge, it is of interest to see how the capital estimates of Indian banks change under the revised framework. The authors conclude that, if adopted, the standardized approach will not pose a significant challenge in terms of capital requirements for smaller banks. Larger banks may show an increase in capital charges under the revised framework; however, the estimated impact on Tier 1 capital should be minimal.

In "How to turn uncertainties of operational risk capital into opportunities from a risk management perspective", our third paper, Philippe Meunier and Arjan Bakker assume that the estimated capital charge is inherently uncertain due to the heaviness and scarcity of operational risk losses in the tail region. Going beyond the regulatory requirements to operational risk measurement, the authors propose leveraging the information embedded in the heavy tail of the loss spectrum to provide relevant business applications to a bank.

In the fourth paper in the issue, "A simulation comparison of aggregation periods for estimating correlations within operational loss data", K. Panman, L. J. Haasbroek and W. D. Pieters investigate whether aggregating loss data annually, quarterly or monthly affects the correlations used in operational risk capital. To answer this question, the authors perform a simulation study covering a wide spectrum of loss frequencies, severity distributions and dependencies between severities. Their main conclusion is that the difference between correlation coefficients calculated from aggregate loss severities only becomes material when the inherent correlation in the loss-generating process exceeds approximately 0.5. From a regulatory capital perspective, where annual aggregation is desired, the study shows that aggregation periods shorter than a year would improve the stability of correlation estimates, and that estimating correlations over a shorter period would not materially misstate the diversification benefit, since the differences in the correlation values are minimal.
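
The sketch below is in the spirit of that study rather than a reproduction of it: it simulates two units' monthly aggregate losses with a known dependence (lognormal margins with assumed parameters), then compares the sampling behavior of Pearson correlation estimates under monthly versus annual aggregation.

```python
import numpy as np

rng = np.random.default_rng(0)

def correlation_by_period(rho, years=10, trials=500):
    months = years * 12
    monthly_est, annual_est = [], []
    for _ in range(trials):
        # Correlated normals -> lognormal monthly aggregate losses
        # for two business units (parameters are illustrative).
        z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=months)
        losses = np.exp(1.0 + 0.8 * z)                     # shape (months, 2)
        monthly_est.append(np.corrcoef(losses.T)[0, 1])
        annual = losses.reshape(years, 12, 2).sum(axis=1)  # annual totals
        annual_est.append(np.corrcoef(annual.T)[0, 1])
    return (np.mean(monthly_est), np.std(monthly_est),
            np.mean(annual_est), np.std(annual_est))

for rho in (0.2, 0.5, 0.8):
    m_mu, m_sd, a_mu, a_sd = correlation_by_period(rho)
    print(f"rho={rho}: monthly {m_mu:.2f} (sd {m_sd:.2f}) "
          f"vs annual {a_mu:.2f} (sd {a_sd:.2f})")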

Marcelo Cruz
