Journal of Operational Risk

Welcome to the fourth issue of the ninth volume of The Journal of Operational Risk.

This has been a watershed year for operational risk, with large operational losses constantly making the headlines. Litigation follows a life cycle that takes between five and eight years to move from inception to a settlement or a court decision, so the lawsuits that originated in the financial crisis of 2007/8 are finally coming to a close. Because the courts, in most cases, allow these lawsuits to be publicly disclosed, the settlement amounts agreed between the parties become public knowledge; and because the amounts involved are so high, these cases have made headlines across the media.

The losses are obviously included in banks' operational loss databases and therefore create a tremendous challenge for modelers. Bearing in mind that operational risk severity distributions are already heavy-tailed, the inclusion of these extremely large settlements has caused a spike in regulatory capital for the firms involved. These losses were so significant that they affected the results of several large US firms, turning profits into losses in their quarterly earnings. One good outcome of these settlements is that they have helped to bring the importance of operational risk management to the attention of the boards of directors and executive management of these organizations. The future of operational risk might be different, and the status of operational risk managers within banks may well change. However, if these things are to happen, methodological advances in operational risk measurement will need to be able to assess the link between losses and manageable risk factors. We would be very happy to publish research on such new methods.

RESEARCH PAPERS
In this issue's one (very large!) research paper, "Estimating operational risk capital with greater accuracy, precision and robustness", John Douglas ("J.D.") Opdyke proves that, due to Jensen's inequality, capital estimates based on even unbiased severity parameter estimators will be biased. That is, operational value-at-risk is a convex function of the severity parameter estimates, because the quantiles being estimated are very high and the severities are heavy-tailed. In Opdyke's opinion, the resulting bias means that capital requirements will always be overstated, and this extra cushion is often quite significant (sometimes even billions of dollars at the unit-of-measure level). Given these estimation challenges, the author presents an estimator of capital that essentially eliminates this upward bias when used with any commonly employed severity parameter estimator. The proposed reduced-bias capital estimator (RCE) would, consequently, be more consistent with regulatory intent regarding the responsible implementation of the loss distribution approach (LDA) framework than other methods that fail to mitigate this bias. The RCE also arguably increases the precision of the capital estimate and consistently increases its robustness to violations of the independent and identically distributed data presumption (which are endemic in operational risk loss event data). So, with greater capital accuracy, precision and robustness, the RCE lowers capital requirements at both the unit-of-measure level and the aggregated enterprise level, and increases capital stability from quarter to quarter. The estimator presented in this paper is straightforward to implement using any major statistical software package.
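To sketch the Jensen's inequality argument in general terms (using our own notation, not necessarily the paper's): if $\hat{\theta}$ is an unbiased estimator of the severity parameters $\theta$, and the capital function $\mathrm{VaR}_\alpha(\cdot)$ is convex in those parameters, then

$$\mathbb{E}\bigl[\mathrm{VaR}_\alpha(\hat{\theta})\bigr] \;\ge\; \mathrm{VaR}_\alpha\bigl(\mathbb{E}[\hat{\theta}]\bigr) \;=\; \mathrm{VaR}_\alpha(\theta),$$

with strict inequality whenever the convexity is strict and $\hat{\theta}$ has nonzero sampling variance. In other words, capital computed by plugging parameter estimates into a convex capital formula is, on average, above the true capital, even though the parameter estimates themselves are unbiased.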

FORUM PAPERS
In this issue we present two forum papers.

In the first, "A review of methods for combining internal and external data", Giuseppe Galloppo and Daniele Previati tackle the problem posed by the lack of a robust internal loss database when calculating economic capital for operational risk. They present a review of the available models that mix internal and external data to estimate both the severity and frequency distributions of operational losses. They show that there are several approaches to the problem, each of which presents both opportunities and limitations. Their findings offer useful insights for enhanced risk practice and prudential supervision.

In the second forum paper, "A checklist-based weighted fuzzy severity approach for calculating operational risk exposure on foreign exchange trades under the Basel II regime", V. Sree Hari Rao and K. V. N. M. Ramesh apply fuzzy logic techniques to assess operational risk exposures on a foreign exchange trading desk.

Marcelo Cruz
