Welcome to the inaugural issue of The Journal of Operational Risk. The Journal's main focus is research on the measurement and management of operational risk, and it aims to promote greater understanding of this new and fast-growing area of risk. The Journal also aims to be a dedicated forum for discussion of this subject, as until now there has been no dedicated outlet for technical papers on operational risk. Research in operational risk is a growth field in both the financial industry and academia. Many lines of research are currently under way, most of them trying to overcome the challenges posed by the new regulatory standards created by the Basel II Accord. Yet there has been no single forum for the debate of these ideas. The Journal of Operational Risk intends to fill this much-needed role.
In this first issue, we present four research papers in the main section. In the first paper, Infinite-mean models and the LDA for operational risk, Nešlehová, Embrechts and Chavez-Demoulin discuss several important issues that financial institutions face in calculating operational VAR. The authors touch on correlation within the VAR estimation, the potential use of α-stable distributions and mixture models, and particularly on some issues that arise in the use of extreme value theory and high-quantile estimation. Readers using the "loss distribution approach" (LDA) for operational risk measurement should find this article particularly insightful.
In the second paper, Applying robust methods to operational risk modeling, Chernobai and Rachev show how robust estimation methods might be used to estimate statistical distributions for operational risk. The authors highlight an interesting point: robust analysis is not purely about discarding the largest events (the outliers) but also about understanding the impact that outliers have on the bulk of the data. They illustrate their approach with an example using operational risk data.
The third paper, Quantifying operational risk guided by kernel smoothing and continuous credibility: A practitioner's view, by Gustafsson, Nielsen, Pritchard and Roberts, provides a practical, step-by-step application of nonparametric smoothing techniques, comparing them with parametric techniques based on extreme value theory, and also applies credibility theory in the aggregation step.
In the fourth paper, Modeling insurance mitigation on operational risk capital, Bazzarello, Crielaard, Piacenza and Soprano incorporate insurance mitigation into the calculation of operational VAR using the LDA. Their model applies the effects of insurance coverage to individual loss events, which they deem a reasonable proxy for actual coverage, as it takes into account individual insurance contract conditions, deductibles and limits. The authors also incorporate stochastic factors such as payment uncertainty and counterparty default risk into the calculation.
Operational Risk Forum
This section is intended to provide a less formal forum for findings and ideas about operational risk, without the academic rigor demanded in the main section. The mission of the Forum is to promote active discussion of current issues in operational risk. Articles submitted to this section should preferably not exceed 8,000 words. Contributions to the Forum can be articles that seek to explain difficult or unclear but otherwise known concepts and results. The articles we would like to see in the Forum should be tutorial and highly educational in nature. The main goal of submitted articles is to bring a higher level of understanding, to both industry and academia, of issues and topics that might not normally be readily accessible to either side.
In this first issue, Jay Jhaveri has written a piece on the important issues of anti-money laundering and know-your-customer in the area of M&A and VC/PE. He offers a view of how to tackle such risks, provides a list of key risk indicators to be verified, and includes an interesting discussion of the subject.