Journal of Operational Risk

Marcelo Cruz

Chief Risk Officer, Aviva

With this issue of The Journal of Operational Risk we are moving into summer, and it looks as though the credit crunch crisis and its impact on financial markets are starting to subside after almost a year. Yet the question within the industry of how the real economy will be affected by this new environment still remains. A clearer picture should be available by the end of the season, when it is expected that oil prices, and their impact on inflation, will finally settle at more realistic levels.

Considering the whole situation, it has been a very busy year for risk managers from many points of view: most chief risk officers (CROs) and first-tier senior risk managers have lost their jobs, and those who stayed put are extremely busy with requests from the board, investors and executive management. Risk managers will certainly never forget this period. We saw the impact of the credit crunch crisis in all types and forms: volatilities never observed before, absolutely gigantic credit write-offs and one of the largest operational losses in history. Many lessons will certainly be learned from this period, and we welcome articles that comment on how risk managers deal with moments and situations such as this.

I would also like to express my excitement over the quality of the papers we have been receiving lately, and this issue is a clear example of what I mean. Professors and practitioners with strong reputations and track records in related areas are starting to pay attention to operational risk, and this has had an extremely positive impact on the overall quality of the papers submitted. In terms of research focus, it is now clear that practitioners are becoming more comfortable with loss distribution approaches, and attitudes are changing towards the aggregation of the different types of data required to measure operational risk, with correlation now being brought into the picture.

Having said that, I would also like to encourage readers to submit papers to the ‘Forum’ section. This section is aimed at generating discussion of current events, with less emphasis on technical aspects, formulas and mathematics. I wish you a great summer and hope that the crisis continues its march towards an end so we can enjoy the hot weather without too many concerns.

Research Papers

In this issue we are pleased to present four papers of high quality. I would like to draw the reader’s attention to the emergence of copulas as a correlation/aggregation tool, now one of the most actively researched subjects in operational risk. Böcker and Klüppelberg report their research on the subject in “Modeling and measuring multivariate operational risk with Lévy copulas”. In this paper they handle the very practical issue of aggregating operational risk capital for different event types/business lines across the firm.

They do this by using Lévy copulas to model the dependence between operational risk events. The approach is particularly interesting because it brings both frequency and severity into the equation, unlike other copula approaches, which use one or the other. In the second paper, “Aggregating operational risk across matrix structured loss data”, Embrechts and Puccetti study the problem of evaluating the overall risk in a matrix of random losses with a given probabilistic structure; here, too, copulas are used.
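Neither paper’s machinery can be reproduced in a few lines, but the practical question both address is easy to illustrate. The sketch below is a minimal Monte Carlo example in Python with made-up frequency and severity parameters; it is not an implementation of Lévy copulas or of the matrix bounds, but it shows how much the aggregate 99.9% quantile of two compound Poisson loss cells can depend on the dependence assumption, from independence to comonotonicity.

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 200_000  # number of simulated years

def compound_poisson(lam, mu, sigma, n):
    """Annual aggregate loss of one cell: Poisson frequency, lognormal severity."""
    counts = rng.poisson(lam, size=n)
    severities = rng.lognormal(mu, sigma, size=counts.sum())
    year = np.repeat(np.arange(n), counts)
    return np.bincount(year, weights=severities, minlength=n)

# Two hypothetical event-type/business-line cells (parameters are illustrative only)
s1 = compound_poisson(lam=20.0, mu=9.0, sigma=1.8, n=n_years)
s2 = compound_poisson(lam=5.0, mu=10.0, sigma=2.2, n=n_years)

q = 0.999
var_indep = np.quantile(s1 + s2, q)                    # independent cells
var_comon = np.quantile(np.sort(s1) + np.sort(s2), q)  # comonotonic: bad years coincide
var_sum = np.quantile(s1, q) + np.quantile(s2, q)      # sum of stand-alone VaRs

# VaR is additive for comonotonic risks, so the last two figures should roughly agree,
# while the independent case usually sits well below the simple sum.
print(f"99.9% VaR, independent cells : {var_indep:,.0f}")
print(f"99.9% VaR, comonotonic cells : {var_comon:,.0f}")
print(f"Sum of stand-alone VaRs      : {var_sum:,.0f}")
```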

In the third paper, “Transform approach for operational risk modeling: value-at-risk and tail conditional expectation”, Jang and Fu derive analytical forms of the Laplace transform of the distribution of aggregate operational losses. They then apply value-at-risk (VaR) and tail conditional expectation (TCE, also known as TailVaR) to evaluate the operational risk capital charge. Unlike most practitioners, who use Monte Carlo simulation, the authors use the fast Fourier transform to approximate VaR and TCE numerically, and figures of the distributions of aggregate operational losses are provided, together with numerical comparisons of VaR and TCE.
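For readers who have not used transform methods, the following sketch shows the general idea in Python. It inverts the characteristic function of a compound Poisson model with a lognormal severity via a plain FFT and reads VaR and TCE off the resulting grid; the parameters are illustrative and the construction is only a loose analogue of the Laplace transform approach in the paper.

```python
import numpy as np
from scipy import stats

# Illustrative compound Poisson model: Poisson(lam) frequency, lognormal severity
lam, mu, sigma = 25.0, 9.0, 1.6   # assumed parameters, for illustration only
h, n = 1_000.0, 2**20             # grid span and size; n must be large enough that
                                  # the aggregate tail fits on the grid (else aliasing)

# Discretise the severity distribution onto the grid {0, h, 2h, ...} (rounding method)
sev = stats.lognorm(s=sigma, scale=np.exp(mu))
edges = (np.arange(n) + 0.5) * h
f = np.diff(np.concatenate(([0.0], sev.cdf(edges))))

# Aggregate-loss pmf from the compound Poisson characteristic function:
# phi_S(t) = exp(lam * (phi_X(t) - 1))
phi_x = np.fft.fft(f)
g = np.fft.ifft(np.exp(lam * (phi_x - 1.0))).real
g = np.clip(g, 0.0, None)

grid = np.arange(n) * h
cdf = np.cumsum(g)

alpha = 0.999
i = np.searchsorted(cdf, alpha)                # smallest grid point with cdf >= alpha
var = grid[i]
tce = (grid[i:] * g[i:]).sum() / g[i:].sum()   # E[S | S >= VaR]

print(f"99.9% VaR ~ {var:,.0f}, 99.9% TCE ~ {tce:,.0f}")
```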

In the fourth paper, “Should risk managers rely on the maximum likelihood estimation method while quantifying operational risk?”, Ergashev compares the performance of four estimation methods, including maximum likelihood, that can be used to fit operational risk models to historically available loss data. The competing methods are based on minimizing different measures of the distance between the empirical and fitted loss distributions. Ergashev shows that there are feasible alternatives to the maximum likelihood estimation method.
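As a rough illustration of the kind of comparison involved (not Ergashev’s actual experiments or distance measures), the sketch below fits a lognormal severity to simulated loss data both by maximum likelihood and by minimising a Cramér–von Mises-style distance between the empirical and fitted distribution functions.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)

# Simulated "historical" losses from a known lognormal (parameters are assumptions)
true_mu, true_sigma = 10.0, 2.0
losses = rng.lognormal(true_mu, true_sigma, size=250)

# 1) Maximum likelihood: closed form for the lognormal (moments of the log-losses)
log_losses = np.log(losses)
mle = (log_losses.mean(), log_losses.std(ddof=0))

# 2) Minimum distance: minimise a Cramer-von Mises-type statistic
x = np.sort(losses)
emp = (np.arange(1, len(x) + 1) - 0.5) / len(x)   # empirical cdf at the order statistics

def cvm_distance(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    model_cdf = stats.lognorm(s=sigma, scale=np.exp(mu)).cdf(x)
    return np.sum((model_cdf - emp) ** 2)

md = optimize.minimize(cvm_distance, x0=np.array(mle), method="Nelder-Mead").x

print(f"true parameters    : mu={true_mu:.2f}, sigma={true_sigma:.2f}")
print(f"maximum likelihood : mu={mle[0]:.2f}, sigma={mle[1]:.2f}")
print(f"minimum distance   : mu={md[0]:.2f}, sigma={md[1]:.2f}")
```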
