Journal of Operational Risk

Marcelo Cruz

Welcome to the fourth issue of the sixth volume of The Journal of Operational Risk. Just when we thought we were emerging from the economic slump, we were brought back to the reality that the crisis that began in 2007 will be with us for a long time yet. We experienced a partial recovery in 2009, but this has not been maintained in the years that have followed. The impact of this crisis on the banking industry has been remarkable. We read news every day that banks are cutting staff, reducing their risk taking and, therefore, lowering their returns.

This gloomy environment presents risk managers with many challenges, but also with a great opportunity to develop more robust risk-management frameworks. One thing is for sure: risk management will not be the same after these turbulent years.

The changes that are occurring are already palpable. For example, chief risk officers now report to CEOs and/or have a direct reporting line to the board. This was not the case in most firms in 2007. Another indicator of change is the significant increase in the time that risk managers, depending on seniority and function, spend with regulators. A risk manager can now spend up to 50% of his or her time in meetings (or preparing material for meetings) with regulators; in 2007, these meetings were quarterly and would not represent more than a small fraction of the manager’s time. We would like to encourage potential authors to submit papers reporting these changes. In this issue we present some technical papers that effectively represent the current interests of the industry. Most papers that we receive are either about ways to merge the different types of inputs required in operational risk or about finding smart ways to calculate capital when the internal data contain a few very large events and many low-severity events.

I ask potential authors not to feel discouraged from submitting to The Journal of Operational Risk because of the aforementioned crisis. I would also like to reemphasize that the journal is not solely for academics to publish in. We at The Journal of Operational Risk encourage readers to submit papers to the Operational Risk Forum section. This section is aimed at the discussion of current events and does not concern itself too much with technical aspects or with formulas and mathematics. We would be extremely happy to see more submissions offering practical, current views of relevant matters that affect day-to-day activities.

RESEARCH PAPERS


In the first paper, “An efficient threshold choice for the computation of operational risk capital”, Dominique Guégan, Bertrand K. Hassani and Cédric Naud tackle a data set issue that is commonly faced by operational risk modelers: a few large events can have a tremendous impact on value-at-risk calculations. In order to account for these large events they use extreme value distributions and propose a two-component mixture model to characterize the loss distribution functions associated with operational risks: a lognormal distribution on the body of the severity distribution and a generalized Pareto distribution on the right tail. The optimal threshold at which the model switches from one distribution to the other is found using a bootstrap method that the authors develop and present to us here.
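For readers who want a concrete starting point, the following is a minimal sketch of the body-tail idea the paper builds on: a lognormal fitted below a threshold and a generalized Pareto distribution fitted to the exceedances above it. The threshold here is fixed at an arbitrary empirical quantile purely for illustration; the bootstrap-based threshold selection is the authors’ contribution and is not reproduced.

```python
# Illustrative body-tail severity model: lognormal body, GPD tail.
# The threshold below is an arbitrary choice for demonstration only;
# the paper's bootstrap procedure for selecting it is not shown here.
import numpy as np
from scipy import stats

def fit_body_tail(losses, threshold):
    """Fit a lognormal to losses at or below `threshold` and a
    generalized Pareto distribution to the exceedances above it."""
    body = losses[losses <= threshold]
    excess = losses[losses > threshold] - threshold
    shape, _, scale = stats.lognorm.fit(body, floc=0)   # lognormal body
    xi, _, beta = stats.genpareto.fit(excess, floc=0)   # GPD tail (POT)
    return (shape, scale), (xi, beta)

# Example on simulated heavy-tailed losses, threshold at the 95th percentile.
rng = np.random.default_rng(0)
losses = rng.lognormal(mean=10.0, sigma=2.0, size=5000)
body_params, tail_params = fit_body_tail(losses, np.quantile(losses, 0.95))
print("lognormal body (shape, scale):", body_params)
print("GPD tail (xi, beta):", tail_params)
```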

In the second paper, “A framework for uncertainty modeling in operational risk”, Tatiana Sakalo and Matthew Delasey try to give a different perspective on one of the most common themes of papers submitted to this journal: how to aggregate different types of data. The authors relax the assumption that we should calculate frequency and severity parameters separately by developing a framework that allows for the incorporation of uncertainty by expressing inputs as p-boxes. Evidence theory is employed to combine the information obtained from different sources, such as internal loss data (objective) and quantitative risk assessment (subjective). The modeled uncertainty provides bounds for the objective model.
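As a rough illustration of the p-box idea (not of the authors’ evidence-theory machinery), one can bound an uncertain severity distribution by the pointwise envelope of candidate CDFs. The two lognormal parameterizations below are hypothetical stand-ins for an objective, data-driven input and a subjective, assessment-driven one.

```python
# Rough illustration of a p-box: an envelope of CDFs that bounds an
# uncertain severity distribution. The parameterizations are
# hypothetical; the paper's evidence-theory combination of objective
# and subjective inputs is not reproduced here.
import numpy as np
from scipy import stats

# Two plausible lognormal severity models (hypothetical parameters).
candidates = [
    {"s": 1.8, "scale": np.exp(9.5)},   # e.g. fitted to internal loss data
    {"s": 2.2, "scale": np.exp(10.0)},  # e.g. implied by a risk assessment
]

def pbox_bounds(x):
    """Pointwise lower/upper envelope of the candidate CDFs at x."""
    cdfs = np.array([stats.lognorm.cdf(x, **c) for c in candidates])
    return cdfs.min(axis=0), cdfs.max(axis=0)

# Bounds on the probability of staying below a given loss level.
lo, hi = pbox_bounds(np.array([50_000.0]))
print(f"P(loss <= 50,000) lies in [{lo[0]:.3f}, {hi[0]:.3f}]")

# Inverting the envelope bounds any quantile of the uncertain
# distribution, e.g. the 99.9th percentile used for capital.
q = 0.999
quantiles = [stats.lognorm.ppf(q, **c) for c in candidates]
print(f"99.9% severity quantile lies in "
      f"[{min(quantiles):,.0f}, {max(quantiles):,.0f}]")
```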

In the third paper, we allow the author of a paper in the last issue of the journal to extend his views. In “Computing the value-at-risk of aggregate severities”, Henryk Gzyl examines the difficulties encountered in computing the distribution of a compound loss and, from it, the value-at-risk. Extending the technique from his previous paper, he shows how explicit models can be used to compute the multidimensional Laplace transforms of all compound losses.
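For context, the quantity in question is the value-at-risk of a compound loss S = X_1 + ... + X_N. The sketch below computes it by plain Monte Carlo, the brute-force baseline against which transform-based approaches such as Gzyl’s are attractive; all parameters are illustrative, and this is not the paper’s method.

```python
# Baseline Monte Carlo VaR of a compound loss: simulate Poisson
# frequencies, draw that many lognormal severities per scenario,
# sum them, and read off the empirical quantile. Parameters are
# illustrative only.
import numpy as np

rng = np.random.default_rng(42)

def compound_var(lam, mu, sigma, q=0.999, n_sims=50_000):
    """Monte Carlo VaR of S = X_1 + ... + X_N with N ~ Poisson(lam)
    and X_i ~ Lognormal(mu, sigma)."""
    counts = rng.poisson(lam, size=n_sims)
    annual = np.array([rng.lognormal(mu, sigma, n).sum() for n in counts])
    return np.quantile(annual, q)

print(f"99.9% VaR: {compound_var(lam=25, mu=9.0, sigma=2.0):,.0f}")
```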

FORUM PAPERS


In “Leadership and high-reliability organizations: why banks fail”, the always controversial Brendon Young considers the concept of high reliability, identifying its cultural aspect, and suggests that this could offer a way forward, particularly for systemically important financial organizations. Young asserts that cultural change and a commitment to “high reliability” are necessary factors in establishing a stable banking sector, and that regulation and legislation alone are not sufficient. This makes for interesting reading as legislators across the globe consider a new set of regulations to curb banks’ activities.
