Journal of Operational Risk

Spring and summer bring with them a number of important operational risk conferences, among them OpRisk North America and OpRisk Europe, both of which are organized by the publishers of The Journal of Operational Risk. These gatherings give me a good opportunity to catch up with operational risk practitioners whom I have known for a long time and to observe what financial firms and regulators are doing in the area. This year's meetings suggest that changes to operational risk regulation are likely, particularly when it comes to capital requirements, where the minimum ratios in both the Basic Indicator Approach and the Standardized Approach need to be increased in order to catch up with a perceived increase in operational risk in the industry. Indeed, in just the last few months there have been serious system events that have incapacitated large institutions: for example, the Chicago Board Options Exchange went offline for most of a trading day because of a system fault. On that same day, Ben Bernanke, the chairman of the Federal Reserve, alerted the market to "the dangers of operational risks". The Office of the Comptroller of the Currency has also stated that operational risk is at the top of its agenda.

In light of this heightened vigilance on the regulators' side, it was inspiring to see that a number of practitioners are focusing on managing operational risk using the tools developed in recent years. Many firms are increasingly using scenario analysis as a management tool and making better use of control environment indicators to monitor and manage risk. We expect to see some of these new methods published in this journal soon.

I would like to ask potential authors to continue to submit papers to The Journal of Operational Risk on the state of operational risk research, and I would again like to emphasize that the journal is not solely for academic authors. Please note that we do publish papers that do not have a quantitative focus; one example appears in the current issue. We at The Journal of Operational Risk would be happy to receive more submissions containing practical, current views of relevant matters as well as papers focusing on the technical aspects of operational risk.

RESEARCH PAPERS

We have two very interesting technical papers in this issue of the journal: the first reports the authors' work on techniques for coping with a lack of data in a fast-growing market, and the second presents a novel nonparametric approach to assessing risk.

In the first paper in the issue, "Measuring the operational risk of Chinese commercial banks using the semilinear credibility model", Jing Lu, Lei Guo and Xing Liu discuss a very topical subject: rapidly growing exposure to operational risk in an environment where investments in infrastructure never catch up with investments in business expansion. According to the authors, this is the case in the Chinese financial industry, where commercial banks face increasingly large operational risk exposures because of the rapid development of the financial market and the wide application of new technologies. In order to cope with this problem, Chinese regulators have established stringent requirements to address these risks and to ensure reliable operational risk measurement and management. However, as usual, the inadequate quantity of internal loss data from banks hampers the accurate measurement of operational risk. In this paper the authors combine commercial banks' internal operational risk loss data with external loss data to address the paucity of data, and they develop a method, based on semilinear credibility theory (a theory widely employed in the non-life-insurance field), for setting the threshold that splits the body of a distribution from its tail. Following a recommendation from the editors of the journal, the authors also perform an empirical analysis based on loss data from 1990 to 2011 and identify the operational risk capital for large Chinese banks.
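For readers unfamiliar with credibility theory, the sketch below illustrates the classical linear (Bühlmann-style) credibility blend of internal and external severity estimates, the simpler ancestor of the semilinear model the authors employ. It is an illustration only: the sample sizes, the credibility constant `k` and the simulated losses are hypothetical, not taken from the paper.

```python
import numpy as np

# Minimal sketch of linear (Buhlmann-style) credibility weighting, the
# classical ancestor of the semilinear model discussed in the paper.
# All inputs below are hypothetical illustrations, not data from the paper.

def credibility_blend(internal_losses, external_losses, k):
    """Blend internal and external mean severities.

    Z = n / (n + k) is the credibility weight given to the internal
    experience, where n is the internal sample size and k reflects the
    ratio of within-bank variance to between-bank variance.
    """
    n = len(internal_losses)
    z = n / (n + k)
    return z * np.mean(internal_losses) + (1 - z) * np.mean(external_losses)

# Example: a bank with only 30 internal tail losses leans heavily on
# the pooled external data.
rng = np.random.default_rng(42)
internal = rng.lognormal(mean=12.0, sigma=1.5, size=30)    # sparse internal data
external = rng.lognormal(mean=12.4, sigma=1.8, size=5000)  # pooled external data
print(credibility_blend(internal, external, k=100.0))
```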

In our second paper, "Measuring risk with ordinal variables", Silvia Figini and Paolo Giudici propose a novel approach to measuring risk when the available data is expressed on an ordinal scale. The authors obtain a new risk index bounded between 0 and 1, which leads to a risk ordering consistent with a stochastic dominance approach. The proposed measure, being nonparametric, can be applied to a wide range of problems where data is ordinal and a point estimate of risk is needed. In addition, the authors provide a method for calculating confidence intervals for the proposed risk measure in a Bayesian nonparametric framework, and they illustrate their approach using a database provided by a telecommunications company.
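The authors' index itself is not reproduced here, but the sketch below conveys the flavour of a [0, 1]-bounded risk measure on ordinal data: a hypothetical normalized expected-rank index, together with the first-order stochastic dominance check that a consistent risk ordering must respect. Both functions and the category counts are illustrative assumptions, not the paper's construction.

```python
import numpy as np

# Illustrative only: the authors' index is not reproduced here. This sketch
# shows the kind of object involved: a [0, 1]-bounded risk index on ordinal
# data, plus a first-order stochastic dominance check between two samples.

def ordinal_risk_index(counts):
    """Normalized expected rank of an ordinal sample.

    counts[i] is the number of observations in category i (0 = least
    severe). Returns 0 when all mass sits in the lowest category and 1
    when it all sits in the highest.
    """
    counts = np.asarray(counts, dtype=float)
    k = len(counts)
    probs = counts / counts.sum()
    return float(np.dot(probs, np.arange(k)) / (k - 1))

def first_order_dominates(counts_a, counts_b):
    """True if sample A first-order stochastically dominates sample B:
    A's CDF lies at or below B's at every category, so A is riskier
    when higher categories mean higher severity."""
    cdf_a = np.cumsum(counts_a) / np.sum(counts_a)
    cdf_b = np.cumsum(counts_b) / np.sum(counts_b)
    return bool(np.all(cdf_a <= cdf_b))

low_risk = [50, 30, 15, 5]    # mass concentrated in mild categories
high_risk = [10, 20, 30, 40]  # mass shifted toward severe categories
print(ordinal_risk_index(low_risk), ordinal_risk_index(high_risk))  # 0.25, 0.67
print(first_order_dominates(high_risk, low_risk))                   # True
```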

FORUM PAPERS

In this section of the journal we publish papers reporting day-to-day experiences in operational risk management. In this issue we have two such papers.

In our first forum paper, "Alternative approaches to generalized Pareto distribution shape parameter estimation through expert opinions", Claudio Andreatta and Diego Mazza introduce methods for estimating the shape parameter of the generalized Pareto distribution from expert opinions obtained in self-assessment risk scenarios. The authors scale the difference between the historical loss experience and the subjective loss forecasts to arrive at a more conservative shape parameter. This simple and practical paper is likely to be of interest to many banks.
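As background, the sketch below shows the standard peaks-over-threshold setup in which the generalized Pareto shape parameter lives, fitted here by maximum likelihood on simulated losses. The authors' expert-opinion scaling is not reproduced; the threshold choice and the data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Background sketch only: fits a generalized Pareto distribution (GPD) to
# losses above a threshold, the standard peaks-over-threshold setup in
# which the shape parameter xi lives. The data are simulated, not from
# the paper, and the expert-opinion adjustment is not implemented here.

rng = np.random.default_rng(7)
losses = rng.lognormal(mean=10.0, sigma=2.0, size=10_000)

threshold = np.quantile(losses, 0.95)  # illustrative body/tail split point
excesses = losses[losses > threshold] - threshold

# Fix the location at 0 so only the shape (xi) and scale are estimated.
xi, _, beta = stats.genpareto.fit(excesses, floc=0)
print(f"shape xi = {xi:.3f}, scale = {beta:,.0f}")

# xi > 0 indicates a heavy, Pareto-type tail; a larger xi means a heavier
# fitted tail, which is why a more conservative (higher) shape parameter
# raises capital. Below: the 99.9th percentile of the excess distribution,
# shifted back to the loss scale.
q999 = threshold + stats.genpareto.ppf(0.999, xi, loc=0, scale=beta)
print(f"tail quantile on the loss scale: {q999:,.0f}")
```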

In the second forum paper, "Quantile distance estimation for operational risk: a practical application", Vincent Lehérissé and Alexis Renaudin tackle a very popular concern among practitioners: parameter estimation for severity distributions. Common techniques such as maximum likelihood or the generalized method of moments are often unreliable in this setting, presenting a significant challenge to operational risk modelers. This paper adapts an alternative method, quantile distance estimation, which estimates severity distribution parameters by minimizing a distance between empirical and theoretical quantiles. The authors also compare quantile distance estimation with the common estimation methods on real data sets, providing a very interesting discussion in the process.
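As a rough illustration of the idea, the sketch below implements a minimal quantile distance estimator under stated assumptions: a lognormal severity model, a fixed grid of probability levels and a squared distance on the log scale. The paper's exact distance and weighting scheme may differ, and the data are simulated rather than real losses.

```python
import numpy as np
from scipy import optimize, stats

# Minimal quantile distance estimation sketch under stated assumptions:
# lognormal severity, squared-error distance on a fixed probability grid.
# Simulated data stand in for a real loss sample.

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=9.0, sigma=1.8, size=2_000)

probs = np.linspace(0.05, 0.99, 40)  # probability grid
emp_q = np.quantile(losses, probs)   # empirical quantiles

def quantile_distance(params):
    """Squared distance between empirical and lognormal quantiles,
    computed on the log scale so large quantiles do not dominate."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    theo_q = stats.lognorm.ppf(probs, s=sigma, scale=np.exp(mu))
    return float(np.sum((np.log(emp_q) - np.log(theo_q)) ** 2))

res = optimize.minimize(quantile_distance, x0=[8.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(f"QD estimates:  mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")

# For comparison, the maximum likelihood estimates on the same sample:
sigma_mle, _, scale_mle = stats.lognorm.fit(losses, floc=0)
print(f"MLE estimates: mu = {np.log(scale_mle):.3f}, sigma = {sigma_mle:.3f}")
```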
