Journal of Operational Risk

Marcelo Cruz


Welcome to the fourth issue of Volume 18 of The Journal of Operational Risk.

I have recently been talking to a number of operational risk executives and practitioners around the globe, as I often do. Since the Notice of Proposed Rulemaking (NPR) for the Basel III Endgame (commonly known as B3E) was issued by the US Federal Reserve, the Office of the Comptroller of the Currency and the Federal Deposit Insurance Corporation last July, disappointment with the prospects for operational risk has increased significantly. Among other revisions, the NPR proposes new capital requirements related to operational risk. The current requirement, based on the advanced measurement approach framework, requires banks to develop internal models to measure operational risk. The NPR, however, eliminates the use of internal models for operational risk out of concern by the agencies that such models lack transparency and comparability: internal models make it difficult for supervisors and market participants to assess the relative magnitude of operational risk across banking organizations, the adequacy of operational risk capital and the effectiveness of operational risk management practices. Several practitioners who were involved in quantification have either left the area or are considering leaving, and financial institutions may even end the development of these models for calculating their internal economic capital. We would be very interested in receiving papers expressing readers’ views on this hot topic.

We are also interested in receiving more papers on the application of machine learning (ML) techniques – one of the current key interests in the industry. In addition to papers on ML and artificial intelligence (AI), we also welcome those on cyber and IT risks (not just on their quantification but also on better ways to manage them). We also aim to publish more papers on enterprise risk management (ERM) and everything this broad subject encompasses (eg, risk policies and procedures, implementing firm-wide controls, risk aggregation, revamping risk organization, internal audit). Analytical papers on operational risk measurement are always welcome but should ideally focus on stress testing and actually managing such risk. The Journal of Operational Risk, as the leading publication on operational risk measurement and management, is at the vanguard of discussions, and it welcomes papers that shed light on discussions relating to ERM and ML/AI as well as the NPR. These are certainly exciting times in OpRisk!



In the issue’s first paper, “Estimating the correlation between operational risk loss categories over different time horizons”, Maurice L. Brown and Cheng Ly focus on the issue of loss frequencies having different timescales (ie, daily, monthly or yearly) and, in particular, on estimating the statistics of losses over arbitrary time horizons. They present a frequency model whereby mathematical techniques can be applied to calculate means, variances and covariances that are arguably more accurate than those achieved by more time-consuming Monte Carlo simulations. The authors show that the analytic calculations of cumulative loss statistics in an arbitrary time window – which would otherwise be intractable due to temporal correlations – are feasible under their model. This very interesting paper has potentially useful applications, as these statistics are crucial for simulating loss correlations via copulas. Brown and Ly systematically vary all model parameters to demonstrate the accuracy of their methods for calculating all first- and second-order statistics of aggregate loss distributions. Finally, using combined data from a consortium of institutions, they show that, crucially, different time horizons can lead to a wide range of loss statistics that can significantly affect calculations of capital requirements.
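To give a flavor of why analytic moment calculations matter here, consider a minimal sketch (a generic compound Poisson model with lognormal severities and illustrative parameters, not Brown and Ly's frequency model): the mean and variance of the aggregate loss have closed forms that a Monte Carlo simulation only approximates at considerable computational cost.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (not from the paper): Poisson frequency,
# lognormal severity.
lam = 25.0              # expected number of losses per period
mu, sigma = 10.0, 1.0   # lognormal severity parameters

# Analytic moments of the aggregate loss S = X_1 + ... + X_N,
# with N ~ Poisson(lam) and X_i iid lognormal:
#   E[S]   = lam * E[X]
#   Var[S] = lam * E[X^2]
ex  = np.exp(mu + sigma**2 / 2)
ex2 = np.exp(2 * mu + 2 * sigma**2)
mean_analytic = lam * ex
var_analytic  = lam * ex2

# Monte Carlo check: simulate many periods and compare.
n_sims = 200_000
counts = rng.poisson(lam, n_sims)
totals = np.array([rng.lognormal(mu, sigma, n).sum() for n in counts])

print(mean_analytic, totals.mean())
print(var_analytic, totals.var())
```

The simulated mean and variance converge to the closed-form values, but only slowly; the appeal of the authors' approach is that it extends this kind of analytic tractability to cumulative losses over arbitrary time windows, where temporal correlations would otherwise force simulation.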

In “Credible value-at-risk”, the second paper in this issue, Peter Mitic, a regular contributor to The Journal of Operational Risk, notes that some value-at-risk (VaR) calculation results appear to be extremely large and are often rejected on the grounds that they are inconsistent with the operational loss profile of an organization, effectively placing an informal de facto limit on the VaR. The concept and definition of a “maximum” VaR have hitherto rarely been considered. Mitic proposes an objective and simple process to determine whether or not a calculated VaR is “too large”, and he thereby defines a precise meaning of “too large” in this context. A simple decision process, using a constant multiplier of the annualized loss sum, is proposed to reject a distribution that produces an extremely high VaR. The decision process works in conjunction with a bootstrap to reject distributions that produce very low VaR values. Together, these processes form a way to determine whether or not a calculated VaR value is “credible”. A practical guide to using the combined procedures is given, along with a discussion of both their potential problems and their viable solutions.
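The flavor of such a two-sided credibility check can be sketched as follows (the multiplier, quantiles and data below are illustrative assumptions, not Mitic's calibrated values): an upper test rejects a VaR that exceeds a constant multiple of the annual loss sum, and a lower bootstrap test rejects one that falls implausibly below the empirical tail.

```python
import numpy as np

rng = np.random.default_rng(0)

def credible_var(losses, var_estimate, multiplier=10.0,
                 n_boot=2000, lower_q=0.01):
    """Sketch of a two-sided VaR credibility check.

    Upper test: reject if the VaR exceeds `multiplier` times the
    annual loss sum. Lower test: bootstrap the empirical 99.9%
    quantile and reject if the VaR falls below a low quantile of
    the bootstrap distribution.
    """
    annual_sum = losses.sum()  # treat `losses` as one year of losses
    if var_estimate > multiplier * annual_sum:
        return False  # implausibly large
    boot = np.array([
        np.quantile(rng.choice(losses, size=losses.size, replace=True), 0.999)
        for _ in range(n_boot)
    ])
    if var_estimate < np.quantile(boot, lower_q):
        return False  # implausibly small
    return True

losses = rng.lognormal(8.0, 1.2, 500)  # one simulated year of losses
print(credible_var(losses, var_estimate=2.0 * losses.sum()))
```

A VaR of twice the annual loss sum passes both tests here, while one of fifty times the annual sum fails the upper test and one near the median loss fails the bootstrap lower test.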

Our third paper, “How does fintech affect the revenue and risk of commercial banks? Evidence from China” by Lixia Yu, Zhenghan Li and Liujue Li, shows us that the development of financial technology (fintech) companies has given rise to a new competitive environment and has brought opportunities and challenges to the operations of commercial banks. To examine the impact of fintech on commercial banks’ revenue and risk, the authors collected data for 91 Chinese commercial banks of different types from 2013 to 2021, measured the fintech application index at the individual bank level using text mining methods and principal component analysis, and then constructed a fixed-effects panel regression model for empirical testing and heterogeneity analysis. Their results show that the application of fintech can effectively improve the revenue of commercial banks by bringing innovation to their business models and enriching their financial products. Further, the positive effect of fintech is more significant for national commercial banks than for regional ones, while the impact of the application of fintech on the risk of commercial banks also varies according to the type of bank, affecting regional banks more significantly than national banks.
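For readers less familiar with the fixed-effects (within) estimator underlying such panel studies, the following sketch on synthetic data (illustrative only; the dimensions, parameters and variable names are assumptions, not the authors' dataset or specification) shows why demeaning by bank removes the bias that unobserved bank-level effects induce in pooled OLS.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic panel: 30 banks x 9 years, a bank-level "fintech index" x
# and an outcome y that both load on unobserved bank effects alpha.
n_banks, n_years = 30, 9
bank = np.repeat(np.arange(n_banks), n_years)
alpha = rng.normal(0.0, 2.0, n_banks)              # bank fixed effects
x = rng.normal(0.0, 1.0, n_banks * n_years) + 0.5 * alpha[bank]
beta_true = 0.8
y = alpha[bank] + beta_true * x + rng.normal(0.0, 0.3, x.size)

def demean_by_group(v, g, n_groups):
    """Subtract each group's mean from its observations."""
    means = (np.bincount(g, weights=v, minlength=n_groups)
             / np.bincount(g, minlength=n_groups))
    return v - means[g]

# Within (fixed-effects) estimator: demean x and y by bank, then OLS.
x_w = demean_by_group(x, bank, n_banks)
y_w = demean_by_group(y, bank, n_banks)
beta_fe = (x_w @ y_w) / (x_w @ x_w)

# Pooled OLS ignores the bank effects and is biased upward here,
# because x is correlated with alpha.
beta_pooled = (x @ y) / (x @ x)

print(beta_fe, beta_pooled)
```

The within estimate recovers the true coefficient while the pooled estimate overstates it, which is why panel studies of bank-level effects, like this one, control for bank fixed effects.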

In the issue’s fourth and final paper, “Legal risk management in the Polish banking sector”, Agnieszka Modras reviews how Polish banks manage legal risk, ie, what is required from banks to satisfy applicable regulations and management theories, and how this process works in practice. Since the global financial crisis of 2007–9, legal risk has become increasingly important for the banking sector. In Poland this increase is predominantly associated with the so-called regulatory tsunami, which has seen a constantly changing legal framework for bank operations as well as a significant number of customer claims that challenge the validity of foreign-currency-based mortgage loans. Legal risk management has not been covered extensively by academic research, and this study therefore makes a valuable contribution to the field, shedding new light on the topic and providing insights that can inform future research and policy decisions.
