Journal of Operational Risk

This is a special issue of The Journal of Operational Risk, marking our tenth anniversary. To celebrate this milestone for our young journal, I think it is appropriate to provide some history of how we started out. The journey was not always easy.

Back in 2002 and 2003, Matthew Crabbe and I held a number of meetings in London in which we discussed the dearth of publications dedicated to operational risk, which at that time had yet to build its credibility as a risk discipline and was seen as a "qualitative" sister to market risk and credit risk: much less relevant than those well-established fields. Operational risk articles submitted to more traditional financial journals were systematically rejected, not because of the quality of the papers but because the subject of operational risk was not yet mature. Matthew and Incisive Media were courageous enough to take on the idea of creating the journal and gave me the opportunity to lead the risky venture of establishing a journal fully dedicated to operational risk, which at the time had not even been formally sanctioned by the Basel Committee with capital requirements and minimum standards; Basel II was only signed in 2004.

Over these ten years we have published over 150 papers: more than 4500 pages of top-notch technical material on operational risk. We have also received more than 400 papers that did not meet our rigorous publication standards. Pretty much all the leading minds in operational risk, whether academic or practitioner, have been involved in the journal, either by publishing a paper, by reviewing material and/or by providing guidance and direction. I am very proud of our editorial board, from whom I have constantly sourced valued advice. Among them are a president of the Federal Reserve Bank, deans of prestigious universities such as Harvard and Cambridge, senior professors and researchers at top colleges, and chief risk officers and heads of operational risk. The strength and success of The Journal of Operational Risk are the result of all this talent.

In addition to the papers we have published, The Journal of Operational Risk has, over the years, sponsored some of the best technical conferences on operational risk, which I have had the honor of taking part in. The objective of the journal has always been to take operational risk to the next level by fostering discussion of new ideas. Over the years a number of banks have published their operational risk measurement methodologies with us, allowing technical experts to comment even before regulators had examined them; we have also published overviews of the status of the operational risk industry in many emerging countries, showing just how far the journal reaches worldwide.

We can only hope to continue this journey of facilitating the progress of the operational risk industry. There is still a lot to be done in the area, but these challenges only make this community more energized. Thank you to all our readers and contributors! We look forward to the next ten years.

To further celebrate this milestone we have published a special online edition of The Journal of Operational Risk. This virtual issue looks back at some of the papers we have published over the last ten volumes, all of which are free to access until the end of April 2015.

In this issue we have three research papers and one forum paper. In our first paper, "Modeling correlated frequencies with application in operational risk management", Andrei L. Badescu, Lan Gong, X. Sheldon Lin and Dameng Tang propose a copula-free approach to modeling correlated frequency distributions using an Erlang-based multivariate mixed Poisson distribution. They investigate some of the properties of this class of distributions and derive a tailor-made expectation-maximization algorithm for fitting purposes. The applicability of the proposed distribution is illustrated in an operational risk management context, where this class of distributions is used to model operational loss frequencies and their complex dependence structure in a high-dimensional setting. Furthermore, by assuming that operational loss severities follow a mixture of Erlang distributions, the approach leads to a closed-form expression for the total aggregate loss distribution, and its value-at-risk (VaR) can be calculated easily by any numerical method. The efficiency and accuracy of the proposed approach are analyzed using a modified real operational loss data set.
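For readers who want a concrete sense of the frequency/severity aggregation and the VaR read-off described above, the minimal Monte Carlo sketch below may help. It is not the authors' closed-form Erlang-based approach: it uses a plain Poisson frequency and a single Erlang severity, and every parameter value is hypothetical.

```python
# Minimal Monte Carlo sketch of an aggregate operational loss VaR.
# Not the closed-form approach of Badescu et al.; it only illustrates
# frequency/severity aggregation and reading off the VaR quantile.
# All parameter values below are hypothetical.
import numpy as np

rng = np.random.default_rng(42)

lam = 25.0           # hypothetical annual loss frequency (Poisson mean)
erlang_shape = 3     # hypothetical Erlang severity shape (integer-shape gamma)
erlang_scale = 10.0  # hypothetical Erlang severity scale

n_sims = 200_000
annual_losses = np.empty(n_sims)
for i in range(n_sims):
    n = rng.poisson(lam)                                         # number of losses in the year
    severities = rng.gamma(erlang_shape, erlang_scale, size=n)   # Erlang = gamma with integer shape
    annual_losses[i] = severities.sum()                          # aggregate annual loss

var_999 = np.quantile(annual_losses, 0.999)                      # 99.9% value-at-risk
print(f"Simulated 99.9% VaR of the aggregate loss: {var_999:,.1f}")
```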

In the issue's second paper, "Combining scenario and historical data in the loss distribution approach: a new procedure that incorporates measures of agreement between scenarios and historical data", P. J. de Jongh, T. de Wet, H. Raubenheimer and J. H. Venter discuss the fact that most banks use the loss distribution approach in their advanced measurement models to estimate regulatory or economic capital. This boils down to estimating the 99.9% VaR of the aggregate loss distribution, which is very difficult to do accurately given how far in the tail it is. Also, it is well-known that the accuracy with which the tail of the loss severity distribution is estimated is the most important driver in determining a reasonable estimate of regulatory capital. To this end, banks use both internal data and external data (jointly referred to as historical data) as well as scenario assessments in their quest to improve the accuracy with which the severity distribution is estimated. This paper proposes a simple new method whereby the severity distribution may be estimated using historical data and experts' scenario assessments jointly. The way in which historical data and scenario assessments are integrated incorporates measures of agreement between these data sources, which can be used to evaluate the quality of both. In particular, the authors show that the procedure has definite advantages over traditional methods where the severity distribution is modeled and fitted separately for the body and tail parts, with the body part based only on historical data and the tail part on scenario assessments.
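As a purely illustrative toy, and not the authors' procedure, the sketch below fits a single lognormal severity by penalized maximum likelihood to historical losses together with expert scenario quantiles. The data, scenario points and penalty weight are all hypothetical; the sketch only shows what integrating the two data sources into one severity fit can look like in code.

```python
# Toy combination of historical losses and scenario assessments
# in a single severity fit (hypothetical data and weights throughout;
# this is not the de Jongh et al. procedure).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
historical = rng.lognormal(mean=10.0, sigma=2.0, size=500)  # hypothetical historical losses

# Hypothetical expert scenario assessments: probability level -> loss amount
scenario_quantiles = {0.90: 5e6, 0.975: 2e7}
scenario_weight = 50.0  # hypothetical weight on agreement with scenarios

def neg_loglik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    # Likelihood contribution of the historical data
    ll = stats.lognorm.logpdf(historical, s=sigma, scale=np.exp(mu)).sum()
    # Penalize disagreement between fitted and scenario quantiles
    for p, q in scenario_quantiles.items():
        fitted_q = stats.lognorm.ppf(p, s=sigma, scale=np.exp(mu))
        ll -= scenario_weight * (np.log(fitted_q) - np.log(q)) ** 2
    return -ll

res = optimize.minimize(neg_loglik, x0=[10.0, 2.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(f"Combined severity fit: mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
```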

In the third paper in the issue, "Improved goodness-of-fit measures", Peter Mitic describes new goodness-of-fit (GoF) measures that represent significant improvements over the current standard GoF tests. Traditional GoF measures use the intuitive geometrical concept of the area enclosed between the curve of a fitted distribution and the profile of the empirical cumulative distribution function to verify the potential fit of a distribution to the data. A transformation of this profile simplifies the geometry and provides three new GoF tests. The integrity of this transformation is justified by topological arguments. These new tests provide theoretical grounds for the inclusion of qualitative judgements on GoF tests and provide a workable way to objectively choose a best-fit distribution from a group of candidate distributions.
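The sketch below computes the classical area-between-curves measure that the paper takes as its starting point (not the improved tests it proposes): the area between the empirical CDF of a sample and the CDF of a fitted candidate distribution, approximated by the trapezoidal rule. The loss sample and the lognormal candidate are hypothetical.

```python
# Classical "area between curves" goodness-of-fit idea: area between the
# empirical CDF and a fitted candidate CDF. Hypothetical data and candidate;
# Mitic's new tests are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=8.0, sigma=1.5, size=300)  # hypothetical loss sample

# Fit a candidate distribution (lognormal with location fixed at zero)
sigma_hat, _, scale_hat = stats.lognorm.fit(losses, floc=0)

# Empirical CDF and fitted CDF evaluated at the sorted sample points
x = np.sort(losses)
ecdf = np.arange(1, len(x) + 1) / len(x)
fitted_cdf = stats.lognorm.cdf(x, s=sigma_hat, scale=scale_hat)

# Area enclosed between the two CDF profiles (trapezoidal rule)
gap = np.abs(ecdf - fitted_cdf)
area = np.sum(0.5 * (gap[1:] + gap[:-1]) * np.diff(x))
print(f"Area between empirical and fitted CDFs: {area:,.2f}")
print("Smaller area = better fit; compare across candidate distributions.")
```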

In the issue's only forum paper, "An assessment of the efficiency of operational risk management in Taiwan's banking industry: an application of the stochastic frontier approach", Hsiang-Hsi Liu and Mauricio Cortes discuss the importance of operational risk management to the efficiency of Taiwanese banks and try to demonstrate that, by applying risk managerial strategies, banks can improve their performance, soundness and resilience, measured by their risk-adjusted return on capital. In order to perform this study, a stochastic frontier analysis was applied to Taiwanese bank data from 2008 to 2010. The authors' findings reflect the fact that supervisory review and market discipline improve banks' operational efficiency, while leniency in governance and risk management compliance, together with opacity, boosts technical inefficiency.
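For readers unfamiliar with the technique, the sketch below estimates a textbook normal/half-normal stochastic frontier (the Aigner-Lovell-Schmidt log-likelihood) on simulated data. It illustrates the general approach only and is not a reproduction of the authors' model, variables or data.

```python
# Textbook normal/half-normal stochastic frontier on simulated data
# (illustrative only; not the Liu and Cortes specification or data set).
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(2)
n = 200
x = rng.normal(size=n)                         # hypothetical log input
v = rng.normal(scale=0.2, size=n)              # symmetric noise
u = np.abs(rng.normal(scale=0.4, size=n))      # one-sided inefficiency
y = 1.0 + 0.6 * x + v - u                      # hypothetical log output

def neg_loglik(params):
    b0, b1, log_sv, log_su = params
    sv, su = np.exp(log_sv), np.exp(log_su)
    eps = y - (b0 + b1 * x)                    # composed error v - u
    sigma = np.hypot(sv, su)                   # sqrt(sv^2 + su^2)
    lam = su / sv
    ll = (np.log(2.0) - np.log(sigma)
          + stats.norm.logpdf(eps / sigma)
          + stats.norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0, np.log(0.3), np.log(0.3)],
                        method="Nelder-Mead")
b0, b1, log_sv, log_su = res.x
print(f"frontier: y = {b0:.2f} + {b1:.2f} x;  "
      f"sigma_v = {np.exp(log_sv):.2f}, sigma_u = {np.exp(log_su):.2f}")
```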

Marcelo Cruz
