Welcome to the first issue of the seventh volume of The Journal of Operational Risk. As we start the seventh year of our existence, I need to start by thanking the extremely professional staff at Incisive Media for putting this journal out every quarter. Particular thanks are due to Jade Mitchell for her careful tracking of papers as they are transferred between the authors, the reviewers and me. Special thanks also go to the operational risk community for making us the number one forum in which to debate the key challenges that the industry faces. We have published nearly 100 articles: papers that are some of the best ever published in the area and that are frequently cited in other publications worldwide. This makes us all proud.
I urge potential authors to continue to submit papers to the journal on the state of operational risk research. Again, I would like to emphasize that the journal is not solely for academic authors. Please note that we do publish papers that do not have a quantitative focus, and indeed there is one example of this in the current issue. The Journal of Operational Risk would be happy to see more submissions containing practical, current views of relevant matters as well as papers focusing on the technical aspects of operational risk.
In this issue we have three Research Papers and one paper in the Forum section.
In the first paper, "Treatment of the data collection threshold in operational risk: a case study using the lognormal distribution", Alexander Cavallo, Benjamin Rosenthal, Xiao Wang and Jun Yan analyze the impact of the decision to set loss data collection thresholds on the estimation of truncated and/or shifted distributions. There is currently quite a heated debate in the industry, particularly in the United States, over whether shifted models result in biased parameter estimates; that debate rests on the premise that the true model is known to be truncated, and it does not objectively assess shifted distributions. The authors shed some light on this issue in their very interesting paper.
In the second paper, a regular contributor to our journal, Eric W. Cope, gives us "Combining scenario analysis with loss data in operational risk quantification". This focuses on one of the hottest topics in operational risk of late, and I am sure the paper will attract significant interest. The author models the underlying stochastic process that generates losses within a unit of measure as a superposition of various subprocesses that characterize individual loss-generating mechanisms. Using a nonparametric Bayesian framework, the paper provides a method for identifying these mechanisms, performing scenario analysis and combining the outcomes with relevant historical data to compute the aggregate loss distribution for that unit of measure.
In the third paper, "A nonparametric approach to analyzing operational risk with an application to insurance fraud", Catalina Bolancé, Mercedes Ayuso and Montserrat Guillén defend the argument that nonparametric methods retain most of the advantages of parametric alternatives when measuring risk. They present a new method for quantile estimation that requires no distributional assumptions, applying it to data on automobile insurance fraud. Interestingly, they conclude that fraud detection systems are effective at mitigating operational risk.
We present one paper in the Forum section of this issue: "Systemic operational risk: the UK payment protection insurance scandal" by Patrick McConnell and Keith Blacker. This paper is an interesting account, from an operational risk perspective, of the events that took place in 2011 when the UK High Court ruled against the British Bankers' Association in their appeal against the regulatory action concerning the mis-selling of payment protection insurance (PPI) products. Following the ruling, the four major UK banks announced provisions totaling over £6 billion to cover restitution to buyers of their PPI products. Some of the banks also decided to leave the PPI business. The authors argue that the losses incurred as a result of the PPI scandal were, for the most part, precipitated by systemic operational risk, particularly people-related risks. Using examples from official inquiries, the paper identifies some of the people risks that went unmanaged in this part of the UK retail banking sector before the PPI market seized up in 2011. The paper then suggests proactive approaches to people risk management that should help to detect and minimize the impact of similar scandals in future. This topic is important: the demographic shift toward longer periods of retirement and the prevalence of the "universal banking model" mean that nontraditional banking products, such as insurance, pensions and investments, will increasingly be sold through banks, raising the possibility of further mis-selling scandals in future.