Journal of Operational Risk
Editor-in-chief: Marcelo Cruz
Volume 11, Number 1 (March 2016)
Welcome to the first issue of Volume 11 of The Journal of Operational Risk. These are challenging days for operational risk modeling. The industry has been waiting since the end of 2015 for a Basel report that might radically transform the way regulatory operational risk capital is calculated - or even scrap modeling altogether in favor of a simple basic indicator approach (BIA) charge. Modelers and risk managers have been meeting in London and New York to discuss how to convince the regulators that this would be a mistake. As the leading publication in the area, The Journal of Operational Risk would like to be at the forefront of these discussions, and we would welcome papers that shed some light on these issues.
This issue contains three technical papers and one forum paper. All three technical papers address economic capital aggregation, offering new techniques for problems that arise in modeling.
In our first paper, "Evaluating operational risk by an inhomogeneous counting process based on Panjer recursion", José Alfredo Jiménez and Viswanathan Arunachalam propose a new approach for calculating operational value-at-risk (OpVaR), using an inhomogeneous counting process based on the Panjer recursion as the frequency distribution, and using generalized Pareto distributions and generalized extreme value theory to model the severities. They calculate the OpVaR using this inhomogeneous Panjer process method. They derive a closed-form expression for the moment-generating function used to determine the aggregate loss distribution, and they then generalize well-known classical models and derive the statistical characteristics for modeling the loss distribution. As we always prefer authors to do in the journal, they provide an example to demonstrate the applicability of their proposed model, and various special cases of this model are discussed.
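For readers unfamiliar with the mechanics, the classical (homogeneous) Panjer recursion that the authors generalize can be sketched in a few lines. This is illustrative only - it is the textbook Poisson case with an invented discretized severity, not the authors' inhomogeneous model.

```python
import numpy as np

def panjer_poisson(lam, f, n_max):
    """Aggregate-loss pmf g on {0, ..., n_max} monetary units for a
    Poisson(lam) frequency and a discretized severity pmf f, where
    f[k] = P(X = k).  Recursion for the Poisson case (a = 0, b = lam):
        g[0] = exp(-lam * (1 - f[0]))
        g[n] = (lam / n) * sum_{j=1..n} j * f[j] * g[n - j]
    """
    g = np.zeros(n_max + 1)
    g[0] = np.exp(-lam * (1.0 - f[0]))
    for n in range(1, n_max + 1):
        j = np.arange(1, min(n, len(f) - 1) + 1)
        g[n] = (lam / n) * np.dot(j * f[j], g[n - j])
    return g

# Hypothetical severity, discretized to units: P(X=1)=0.5, P(X=2)=0.3, P(X=3)=0.2
f = np.array([0.0, 0.5, 0.3, 0.2])
g = panjer_poisson(lam=3.0, f=f, n_max=60)

# OpVaR at 99.9% is the smallest n with cumulative probability >= 0.999.
op_var = np.searchsorted(np.cumsum(g), 0.999)
```

The recursion delivers the whole aggregate loss distribution at once, so the quantile is read off the cumulative sum rather than estimated by simulation.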
In the issue's second paper, "A simulation comparison of quantile approximation techniques for compound distributions popular in operational risk", P. J. de Jongh, T. de Wet, K. Panman and H. Raubenheimer observe that most banks using the loss distribution approach (LDA) model the aggregate loss distribution for each cell in a matrix of operational risk types and business units. The firm-wide aggregate loss distribution is then a compound distribution resulting from a random sum of the losses contained in these cells. In order to estimate the economic or regulatory capital for a particular cell, an extreme quantile of the aggregate loss distribution has to be estimated from the fitted severity and frequency distributions. Since no closed-form expression exists for the quantiles of the resulting estimated compound distribution, the quantile is usually approximated by some form of brute-force Monte Carlo simulation, which is computationally intensive. A number of numerical approximation techniques have been proposed to lessen this burden, including Panjer recursion, the fast Fourier transform and different orders of both the single-loss approximation and the perturbative approximation. Against this background, the objective of the paper is to compare these methods in terms of their practicality and potential applicability in an operational risk context. The authors find that the second-order approximation, a type of closed-form approximation, performs very well at the extreme quantiles over a wide range of distributions and is very easy to implement. This approximation can then be used as an input to the recursive fast Fourier algorithm to gain further improvements at the less extreme quantiles.
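The contrast between the brute-force benchmark and a closed-form approximation is easy to illustrate. The sketch below simulates a single Poisson-lognormal cell (parameters invented for the example) and compares the Monte Carlo 99.9% quantile with the first-order single-loss approximation plus a simple mean correction in the spirit of the higher-order refinements; it is not the authors' implementation or their exact second-order formula.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
lam, sigma, q = 20.0, 2.0, 0.999          # hypothetical cell parameters

# Brute-force Monte Carlo: simulate many annual aggregate losses.
years = 200_000
counts = rng.poisson(lam, years)
sev = rng.lognormal(0.0, sigma, counts.sum())
annual = np.bincount(np.repeat(np.arange(years), counts),
                     weights=sev, minlength=years)
var_mc = np.quantile(annual, q)

# First-order single-loss approximation: for a subexponential severity,
# the tail is driven by one large loss, so
#     VaR_q(S) ~ F_X^{-1}(1 - (1 - q) / lam).
var_sla = np.exp(sigma * norm.ppf(1.0 - (1.0 - q) / lam))

# Adding the expected aggregate loss as a mean correction.
var_sla2 = var_sla + lam * np.exp(sigma**2 / 2.0)
```

The Monte Carlo estimate needs millions of severity draws for one number, while the approximation is a single quantile evaluation - which is precisely the trade-off the paper studies.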
In the third paper, "A maximum entropy approach to the loss data aggregation problem", Erika Gomes-Gonçalves, Henryk Gzyl and Silvia Mayoral discuss the fact that one headache in determining operational risk regulatory capital is the computation of the distribution of losses when the underlying loss data consists of aggregated losses caused by different types of risk events in different business lines (ie, coming from different risk cells). The authors argue that when the loss data is carefully collected and well modeled - ie, when the losses are modeled as a joint vector - maximum entropy ("maxentropic") techniques are quite suitable for finding the probability density of the aggregated loss. When the data is not well modeled, the maxentropic procedure instead yields the marginal densities, which can then be aggregated by means of an appropriate copula. Either way, both possibilities hinge in an essential way on the maxentropic technique's ability to determine a probability density from its Laplace transform. This is valuable because such techniques provide analytical expressions for the densities, which makes many numerical procedures easier to implement.
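The core idea - recovering a density from a handful of Laplace-transform values by maximizing entropy - can be sketched as a small convex optimization. Everything below is invented for illustration (the synthetic losses, the evaluation points and the truncated support), and the plain dual minimization shown is a generic maxent solver, not the authors' procedure.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)
losses = rng.lognormal(0.0, 0.6, 50_000)       # synthetic aggregate losses

alphas = np.array([0.5, 1.0, 1.5, 2.0])        # Laplace-transform points (invented)
mu = np.exp(-np.outer(alphas, losses)).mean(axis=1)   # sample E[exp(-a X)]

x = np.linspace(0.0, 15.0, 2001)               # truncated support grid
dx = x[1] - x[0]
phi = np.exp(-np.outer(alphas, x))             # constraint functions phi_j(x)

def integrate(y):                              # trapezoid rule on the uniform grid
    return dx * (y.sum(axis=-1) - 0.5 * (y[..., 0] + y[..., -1]))

def dual(lam):
    # Convex dual of the maxent problem: log Z(lam) + lam . mu.
    # At the minimum, the density exp(-lam . phi) / Z matches all moments mu.
    return np.log(integrate(np.exp(-lam @ phi))) + lam @ mu

lam = minimize(dual, np.zeros(len(alphas)), method="BFGS").x
unnorm = np.exp(-lam @ phi)
f = unnorm / integrate(unnorm)                 # maxentropic density estimate
```

The payoff is the analytical form exp(-sum_j lam_j exp(-alpha_j x)) for the density, which is what makes downstream numerical work (quantiles, convolutions) convenient.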
We have one forum paper in this issue: "Bank fraud and the macroeconomy" by Robert T. Stewart. He argues that the relationship between the macroeconomy and bank fraud losses has not received much attention from researchers, and claims that an understanding of this potential link would allow banks to tailor resources appropriately when managing risks and performing stress tests. Despite anecdotal evidence suggesting that bank fraud tends to rise with economic activity and to peak at the top of the business cycle, there is limited research on the topic, most likely owing to difficulties in collecting fraud data. The author uses data from suspicious activity reports (SARs) - which financial institutions are mandated to provide to financial authorities - as a proxy for bank fraud, and then performs an empirical correlation test between fraud and changes in gross domestic product (GDP). He claims that his results suggest a positive relationship between bank fraud and GDP growth, which could be extremely helpful in regulatory stress tests and overall fraud risk management.
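The shape of such a correlation test is straightforward. The sketch below uses entirely synthetic series - a quarterly GDP growth rate and a SAR-count fraud proxy constructed to co-move with it, as the paper's thesis suggests - purely to show the mechanics; real SAR and GDP data would replace them.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic illustration: 20 years of quarterly GDP growth (%) and a
# fraud proxy built with a positive loading on growth plus noise.
gdp_growth = rng.normal(0.6, 0.5, 80)
fraud_proxy = 100 + 25 * gdp_growth + rng.normal(0, 10, 80)

# Pearson correlation between the fraud proxy and GDP growth.
r = np.corrcoef(gdp_growth, fraud_proxy)[0, 1]
```

A positive r would be consistent with fraud rising alongside economic activity; a fuller study would of course control for lags and other macro drivers.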
Papers in this issue
Evaluating operational risk by an inhomogeneous counting process based on Panjer recursion
This paper proposes a new approach for determining OpVaR using an inhomogeneous counting process based on Panjer recursion as the frequency distribution.
A simulation comparison of quantile approximation techniques for compound distributions popular in operational risk
The objective of this paper is to compare numerical approximation techniques in terms of their practical usefulness and potential applicability in an operational risk context.
A maximum entropy approach to the loss data aggregation problem
This paper examines and compares alternative ways of solving the problem of determining the density of aggregate losses.
Bank fraud and the macroeconomy
This paper empirically tests for correlations between fraud and the macroeconomy.