Journal of Operational Risk
ISSN: 1744-6740 (print), 1755-2710 (online)
Editor-in-chief: Marcelo Cruz
Volume 5, Number 4 (December 2010)
Editor's Letter
Marcelo Cruz
Welcome to the fourth issue of the fifth volume of The Journal of Operational Risk, the issue that closes the journal's fifth year. This has been a particularly interesting year, as discussion in the risk community centered on the possible economic recovery from the great recession and whether this feeble recovery would last. It is still questionable when we are going to emerge from this economic mess, and the uncertainty brings considerable volatility to the markets, which increases overall risk.
On the regulatory side, Basel III is starting to become a reality. The measures involved have been agreed by world leaders, and a Basel Committee consultative paper is due out any day now. This new set of rules significantly increases the capital that banks must hold. However, institutions will have a reasonably long time to implement the rules, which gives banks time to prepare and capitalize. Although the Basel III rules do not explicitly address operational risk, we are likely to see a much more risk-averse, back-to-basics banking system over the next few years. Quite a few large investment banks have announced that they will close their proprietary trading desks, and these prop traders are now flocking to hedge funds, either founding new ones or joining more established firms. It is interesting to see that risk taking is not dead, just trading places.
I would like to encourage authors to continue submitting to the journal. We at The Journal of Operational Risk welcome submissions that offer practical, current views on relevant matters, as well as papers focusing on the technical aspects of operational risk. Again, I would like to emphasize that the journal is not intended solely for academic authors.
RESEARCH PAPERS
In this issue, we bring you four technical papers. They are quite diverse and address the varying challenges that the industry faces these days, but a common thread running through at least three of them is running time. As firms make constant progress with data collection, they are learning the complexities of running operational risk measurement systems with many hundreds of thousands of data points: where analysts can best use diverse models for frequency and severity, and how to aggregate these distributions. Running time is quickly becoming an issue for most firms, and the drive to achieve the best result with the fewest possible resources shows that our industry is entering a new phase. To exemplify this point, two of the papers in this issue deal with novel ways of handling the aggregation of frequency and severity distributions, which can be time-consuming from a computational standpoint; quite often, firms resort to parallel computing to run these simulations more quickly. In this issue, one author proposes a single-loss approximation and another group of authors proposes fast Fourier transforms as ways of increasing speed. A third paper presents a method that lets analysts work with the smallest possible number of fitted distributions. It is interesting to see operational risk moving into this new phase.
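To make the running-time point concrete, here is a minimal sketch, in Python, of the brute-force Monte Carlo aggregation that the faster methods in this issue aim to avoid. All parameters (a Poisson frequency and a lognormal severity) are hypothetical and chosen only for illustration; they do not come from any of the papers.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical annual-loss model: N ~ Poisson(lam) losses per year,
# each with lognormal severity. All figures are invented for illustration.
lam = 25.0
mu, sigma = 10.0, 2.0

def simulate_annual_losses(n_years=200_000):
    """Brute-force aggregation: S = X_1 + ... + X_N for each simulated year."""
    counts = rng.poisson(lam, size=n_years)
    severities = rng.lognormal(mu, sigma, size=counts.sum())
    # Split the pooled severity draws back into per-year blocks and sum them.
    return np.array([block.sum() for block in
                     np.split(severities, np.cumsum(counts)[:-1])])

S = simulate_annual_losses()
print(f"99.9% VaR (Monte Carlo): {np.quantile(S, 0.999):,.0f}")
```

The per-year loop over hundreds of thousands of blocks is exactly the kind of cost that grows with data volume, which is why analytic approximations and transform methods are attractive.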
In the first paper, “The calculation of minimum regulatory capital using single-loss approximations”, Matthias Degen provides an analytical framework for assessing the accuracy of such approximations and, subsequently, derives an improved and faster method for calculating minimum regulatory capital charges.
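For readers unfamiliar with the idea, the classical first-order single-loss approximation (going back to Böcker and Klüppelberg) replaces the whole simulation above with a single severity quantile; Degen's paper analyzes the accuracy of approximations of this kind and derives refinements, which are not reproduced here. A minimal sketch of the first-order form, using the same hypothetical parameters as above:

```python
import numpy as np
from scipy.stats import lognorm

lam, mu, sigma, alpha = 25.0, 10.0, 2.0, 0.999

# First-order single-loss approximation for heavy-tailed severities F:
#   VaR_alpha(S) ~= F^{-1}(1 - (1 - alpha) / lam)
severity = lognorm(s=sigma, scale=np.exp(mu))
var_sla = severity.ppf(1.0 - (1.0 - alpha) / lam)
print(f"99.9% VaR (single-loss approximation): {var_sla:,.0f}")
```

This runs in microseconds rather than minutes; the price is an approximation error that frameworks such as Degen's make explicit.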
In the second paper, “Recursions and fast Fourier transforms for certain bivariate compound distributions”, Tao Jin and Jiandong Ren consider three classes of bivariate counting distributions and their corresponding compound distributions, and implement recursive methods for computing the joint probability functions of the bivariate compound random variables. They then compare the results with those obtained from fast Fourier transform methods, showing that, with appropriate analysis, the fast Fourier transform method can be a viable alternative to the recursive method for computing joint probabilities. The authors are kind enough to present numerical examples that corroborate their analysis.
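The paper treats the bivariate case; the following sketch shows the same fast Fourier transform idea in the simpler univariate compound Poisson setting, with the same hypothetical parameters as above. The discretization step and grid size are assumptions, and the grid must be wide enough to keep aliasing from the heavy severity tail under control.

```python
import numpy as np
from scipy.stats import lognorm

lam, mu, sigma = 25.0, 10.0, 2.0
h, n = 2_000.0, 2**16          # assumed discretization step and grid size

# Discretize the severity on {0, h, 2h, ...} by the rounding method.
grid = np.arange(n) * h
sev = lognorm(s=sigma, scale=np.exp(mu))
f = sev.cdf(grid + h / 2) - sev.cdf(np.maximum(grid - h / 2, 0.0))

# Compound Poisson in the transform domain: g_hat = exp(lam * (f_hat - 1)).
f_hat = np.fft.fft(f)
g = np.real(np.fft.ifft(np.exp(lam * (f_hat - 1.0))))

cdf = np.cumsum(g)
print(f"99.9% VaR (FFT): {grid[np.searchsorted(cdf, 0.999)]:,.0f}")
```

A single transform pair replaces millions of simulated years, which is why this family of methods bears directly on the running-time theme.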
In the third paper, “Modeling operational loss severity distributions from consortium data”, one of our regular contributors, Eric Cope, describes a statistical methodology for identifying and characterizing the universe of loss severity distributions in a consortium loss database (ORX), using as few distributional models as possible. His procedure is based on successively clustering and pooling losses of various types according to different observed degrees of distributional homogeneity. The author allows for the possibility of scaling loss data according to a simple linear transformation in order to bring the distributions into alignment. He addresses various estimation challenges in dealing with operational risk data, and describes the results of applying this methodology to ORX.
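Cope's clustering-and-pooling procedure is considerably more elaborate than anything that fits here, but the basic pooling decision can be illustrated with a two-sample test and an optional alignment step. The samples, the shift estimator and the 5% threshold below are all invented for illustration and are not the paper's method.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Two hypothetical loss samples, e.g. the same event type at two business lines.
a = rng.lognormal(10.0, 2.0, size=500)
b = rng.lognormal(10.4, 2.0, size=400)

# Test distributional homogeneity directly...
_, p_raw = ks_2samp(a, b)

# ...and again after a simple linear transformation (a shift on the log
# scale, i.e. a multiplicative scaling of losses) to align the distributions.
shift = np.median(np.log(a)) - np.median(np.log(b))
_, p_aligned = ks_2samp(np.log(a), np.log(b) + shift)

print(f"pool as-is: {p_raw > 0.05}, pool after scaling: {p_aligned > 0.05}")
```

Pooling losses that pass such a test is what allows the universe of severity distributions to be described with as few models as possible.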
In the fourth paper, “Operational risk quantification: a risk flow approach”, Gandolf Finke, Mahender Singh and Svetlozar Rachev (a member of the journal’s editorial board) provide a view on operational risk measurement in non-financial firms. We always welcome approaches that use data from other industries. The authors extend the idea of overlaying the three flows in a manufacturing company (materials, financial and information) with a risk flow containing key risk factors in order to assess operational risk. They demonstrate the application of the risk flow concept through a case study at a consumer goods company, implementing the model with discrete-event and Monte Carlo simulation techniques. Results from the simulation are evaluated to show how specific parameter changes affect the level of operational risk exposure for this company.
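As a very rough illustration of the simulation side of such an approach (and emphatically not the authors' model), one can attach failure probabilities and loss severities to the stages of a process flow and Monte Carlo the annual aggregate. Every number below is invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy process flow: materials -> production -> delivery. Each stage can fail
# on a given day, losing a random fraction of the value flowing through it.
p_fail = np.array([0.020, 0.010, 0.015])    # daily failure probabilities
flow_value = np.array([50.0, 120.0, 80.0])  # value flowing per stage (k$)

def annual_loss(n_days=250):
    fails = rng.random((n_days, 3)) < p_fail
    fracs = np.minimum(rng.lognormal(-1.5, 0.75, size=(n_days, 3)), 1.0)
    return float((fails * fracs * flow_value).sum())

sims = np.array([annual_loss() for _ in range(20_000)])
print(f"mean: {sims.mean():.1f}k$, 99% quantile: {np.quantile(sims, 0.99):.1f}k$")
```

Tying loss generation to the value flowing through each stage is what distinguishes this style of model from the purely statistical frequency-severity approach of the earlier papers.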