Demystifying the practice of synthetic CDO valuation
Nosheen Khan, vice president, structured credit valuations at Markit, examines the base correlation mapping methods for implying bespoke collateralised debt obligation (CDO) correlations from standard index tranche correlations
A collateralised debt obligation (CDO) involves creating a portfolio of fixed-income securities or credit derivatives and repackaging the underlying risk of the portfolio by partitioning it into tranches. The tranches are created to suit the risk-return profiles of a wide range of investors. In cash CDOs, the portfolio of underlying securities consists of cash instruments, that is, bonds, loans, asset-backed securities and so on; these constitute the asset side of the structure. The liability side consists of the tranched notes, issued to investors.
With the evolution of a liquid market in credit derivatives, synthetic CDOs have gained significant investor interest. A synthetic CDO is structurally similar to the cash CDO described above, except that the assets are a portfolio of credit default swaps (CDS) and there is no initial physical investment in a security, just the execution of a bilateral credit derivatives contract. Figure 1 (on page 31) shows the structure of a typical synthetic CDO. The seller of protection on the senior mezzanine tranche will make payments only when losses on the portfolio exceed 15%. The tranche loses all its value when losses exceed 30%, at which point the losses start hitting the senior tranche.
Figure 2 shows the payout structure for each tranche. It is apparent that CDO tranches provide option-style payout profiles, which is one of the reasons why they are appealing to investors and have been one of the fastest growing structures in the credit derivatives market. Nevertheless, this poses interesting modelling and pricing challenges.
From vanilla to exotic correlation structures
The launch of the standardised credit derivatives indices, iTraxx and CDX, in 2004 was one of the biggest catalysts in opening up the credit correlation market. The indices were structured to reflect the 125 most liquid corporate names across the major sectors of the investment-grade universe. Market participants started trading standardised tranches on the North American investment-grade and high-yield, and European investment-grade, credit derivatives indices in late 2004. This spurred increased issuance of bespoke synthetic CDOs, as the correlation risk could now be hedged in the standard tranche market.
Tranches on bespoke synthetic CDOs are similar to standard index tranches, except that the portfolio and tranche risk profiles can be selected to suit specific investor requirements. Unlike cash CDOs, the ease of hedging synthetic CDO deals led to growth in the single-tranche CDO market, where the investor agrees with the dealer the underlying entities it would like exposure to, as well as agreeing on the tranche that would provide the specific risk, return and rating the investor desires.
The interesting convexity, roll-down and dispersion characteristics, along with different correlation sensitivities across the capital structure and term structure, have created relative-value opportunities in the tranche market and have helped create a standard liquid traded universe.
More recently, participants have been using the standard CDO technology to trade:
A. Structures with different risk sensitivities such as zero-coupon tranches, tranchlets, leveraged super-senior tranches, tranches on barbell portfolios, tranches with optionality and step-up features, long-short CDOs and CDO-squareds;
B. Strategy-based products such as constant proportion portfolio insurance and constant proportion debt obligation;
C. Correlation in other asset classes, using products such as collateralised foreign-exchange obligation, collateralised commodity obligation and CDOs with equity default swaps underlying;
D. Hybrid structures such as inflation-linked CDOs.
The one-factor Gaussian copula
The challenge of pricing CDOs is modelling the joint behaviour of defaults of multiple entities and finding models to assign meaning to the term 'correlation'.
The first model into the market, the Gaussian copula is still the most widely used for pricing CDO tranches. Simply put, a copula is a mathematical function that provides the cumulative loss distribution of the portfolio, given the default probabilities (implied from the CDS curves) of the entities that constitute the portfolio and an assumption on the correlation between defaults. The Gaussian copula model constructs a multivariate version of the original Merton model, the underlying theory being that a firm defaults when the value of its assets (which follow a stochastic process) falls below its liabilities.
The main assumptions underlying the model are:
1. The value of each firm's assets is normally distributed.
2. The correlations in default times are assumed to be equal and constant across all entities in the bespoke portfolio.
3. The default risk of every entity in the portfolio can be decomposed into two components: market risk and idiosyncratic risk. It is further assumed that the market risk can be represented by a single market factor (hence 'one-factor') and that every entity in the portfolio has the same correlation with that factor.
This is represented by the following relationship:

Xi = √ρ M + √(1 − ρ) Zi

where Xi is the latent variable driving the default of entity i, which defaults when Xi falls below a threshold implied by its default probability; M is the single market risk factor, a standard normally distributed random variable; Zi is the idiosyncratic risk factor for entity i, with all Zi mutually independent standard normal random variables; and ρ is the constant pairwise correlation between default times, taking values between 0 and 1.
These assumptions make implementation of the model fairly straightforward in both semi-analytic and Monte Carlo frameworks. For simplicity, the model is usually run with a single correlation value across all pairs of entities, so tractability and familiarity are the main reasons why it has emerged as the benchmark.
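As an illustrative sketch of the semi-analytic implementation, the conditional default probabilities given the market factor can be fed through a standard convolution recursion and integrated numerically over M. The function names, the crude trapezoidal quadrature and the small homogeneous portfolio below are assumptions made for the sketch, not any particular production implementation:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def conditional_default_prob(p_i, rho, m):
    """P(entity i defaults | market factor M = m) under the one-factor
    Gaussian copula Xi = sqrt(rho)*M + sqrt(1-rho)*Zi, where entity i
    defaults when Xi falls below the threshold N^-1(p_i)."""
    c = N.inv_cdf(p_i)
    return N.cdf((c - rho ** 0.5 * m) / (1 - rho) ** 0.5)

def loss_distribution(default_probs, rho, n_quad=41):
    """Probability of 0..n defaults, by running the convolution
    recursion conditional on M and integrating over M with a
    trapezoidal rule on [-6, 6]."""
    n = len(default_probs)
    probs = [0.0] * (n + 1)
    h = 12.0 / (n_quad - 1)
    for j in range(n_quad):
        m = -6.0 + j * h
        weight = N.pdf(m) * h * (0.5 if j in (0, n_quad - 1) else 1.0)
        dist = [1.0] + [0.0] * n  # zero defaults with certainty, given M
        for p_i in default_probs:
            q = conditional_default_prob(p_i, rho, m)
            for k in range(n, 0, -1):
                dist[k] = dist[k] * (1 - q) + dist[k - 1] * q
            dist[0] *= 1 - q
        for k in range(n + 1):
            probs[k] += weight * dist[k]
    return probs
```

Conditional on M, defaults are independent, which is what makes the recursion valid; integrating the conditional distribution over M then restores the dependence induced by the common factor.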
The base correlation framework
Simply put, base correlation is the default correlation implied by the price of a base tranche, that is, a tranche with an attachment point of 0. All non-base tranches can be priced as a combination of a long and a short position in two base tranches. For example:

ELx–y = EL0–y(ρy) − EL0–x(ρx)

where ELx–y is the expected loss of a tranche with attachment x and detachment y, and ρx is the base correlation of the base tranche with detachment x.
The price of the 0-6 tranche can be calculated from the market prices of the 0-3 and 3-6 tranches. This price can then be used to imply the base correlation for the 0-6 tranche. Hence, it is possible to bootstrap the correlation skew across the capital structure by constructing a series of base tranches using market prices for liquid tranches.
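The bootstrap described above can be sketched in code. The example below assumes, purely for illustration, a large-pool (Vasicek) approximation for the base tranche expected loss and a simple bisection search; all function names are hypothetical:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def pool_loss(p, rho, m):
    """Fraction of a large homogeneous pool lost, given market factor m
    (Vasicek large-pool limit of the one-factor Gaussian copula)."""
    return N.cdf((N.inv_cdf(p) - rho ** 0.5 * m) / (1 - rho) ** 0.5)

def base_el(K, p, rho, n=241):
    """Expected loss of the 0-K base tranche, E[min(L, K)], integrating
    over the market factor with a trapezoidal rule on [-6, 6]."""
    h = 12.0 / (n - 1)
    total = 0.0
    for j in range(n):
        m = -6.0 + j * h
        w = h * (0.5 if j in (0, n - 1) else 1.0)
        total += min(pool_loss(p, rho, m), K) * N.pdf(m) * w
    return total

def bootstrap_rho(k_lo, rho_lo, k_hi, el_mezz, p):
    """Solve, by bisection, for the base correlation at detachment k_hi
    such that base_el(k_hi) - base_el(k_lo) matches the quoted
    mezzanine expected loss EL(k_lo, k_hi)."""
    target = el_mezz + base_el(k_lo, p, rho_lo)
    lo, hi = 0.001, 0.999
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if base_el(k_hi, p, mid) > target:
            lo = mid  # base tranche EL falls as correlation rises
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Given market-implied expected losses for the 0-3 and 3-6 tranches, the same pattern can be repeated up the capital structure to recover the full skew.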
Prior to the wide scale acceptance of base correlations for quoting index tranches, the market's first approach was known as 'implied correlation'. Implied correlation was unsuccessful due to the insensitivity of certain mezzanine tranches to correlation, which meant that there could be multiple implied correlations resulting in the same tranche price, or no implied correlations.
The equity tranche price varies inversely with correlation, that is, higher correlation implies a lower probability of correlated defaults occurring early in the life of the trade, and hence a lower tranche price. The opposite is true for senior tranches ie, higher correlation implies higher prices. It follows that one or more mezzanine tranches exist that may have little or no sensitivity to correlation. Hence, base correlation is the preferred method for implying correlations as, by definition, it implies an upward-sloping skew that is consistent with market prices.
Mapping methodologies
The one-factor Gaussian copula model simply provides a framework to imply a correlation skew from standard liquid index tranche prices. However, pricing bespoke tranches in this framework requires mapping techniques that allow correlations calibrated in the liquid tranche market to be applied to bespoke portfolios.
Mapping on the risk dimension
In order to determine the implied base correlation for a bespoke tranche, one needs a function that maps the risk of the bespoke tranche to that of the standard index tranches. Below are a few mapping techniques introduced over the past couple of years; none is market standard, but each represents a different dimension along which bespoke tranches can be equated to index tranches.
At-the-money (ATM) matching
The theory underlying this methodology is that the risk of a tranche is measured in terms of the tranche strike K as a proportion of the total portfolio expected loss (PEL). Thus, two tranches are considered equivalent if they have the same risk relative to their respective portfolios. This is represented as follows:

Kbespoke / PELbespoke = Kindex / PELindex
Probability matching
This mapping involves finding the index tranche that has the same probability of default P as the bespoke tranche.
The underlying theory is that two tranches are considered equivalent if they have the same probability of being wiped out. Since a change in correlation ρ does not change the portfolio expected loss but merely redistributes it across the capital structure, comparing tranche loss probabilities captures this effect:

Pbespoke(L > Kbespoke) = Pindex(L > Kindex)
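As a sketch, assuming a large-pool one-factor model for the portfolio loss (so that the wipe-out probability has a closed form), the equivalent index strike can be found by bisection. The names and the single shared correlation are illustrative assumptions:

```python
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def wipeout_prob(K, p, rho):
    """P(pool loss exceeds K) in the one-factor large-pool model, ie,
    the probability that the 0-K base tranche is wiped out."""
    m_star = (N.inv_cdf(p) - (1 - rho) ** 0.5 * N.inv_cdf(K)) / rho ** 0.5
    return N.cdf(m_star)

def prob_matched_strike(k_bespoke, p_bespoke, rho, p_index):
    """Bisect for the index strike whose base tranche has the same
    wipe-out probability as the bespoke 0-k_bespoke tranche."""
    target = wipeout_prob(k_bespoke, p_bespoke, rho)
    lo, hi = 1e-6, 1 - 1e-6
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if wipeout_prob(mid, p_index, rho) > target:
            lo = mid  # wipe-out probability falls as the strike rises
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

A bespoke with higher default probabilities than the index maps to a lower index strike, consistent with the ATM method's behaviour for riskier bespokes.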
Equity spread matching/senior spread matching
This mapping requires iteratively solving for the index equity tranche that has the same spread as the bespoke equity tranche, given a correlation assumption. The procedure is circular, in that calibrating the correlation itself requires knowing the equivalent tranche, hence the need for an iterative solution.
The methodology for senior spread matching is the same as that for equity spread matching, except that it involves solving for the equivalent senior tranche instead of the equity tranche.
Tranche loss proportion mapping
The percentile loss of the bespoke, that is, the tranche expected loss (TEL) divided by the PEL, is mapped onto the same percentile loss on the index. This method involves iteratively solving for an index detachment point that equates the percentile losses of the bespoke and the index:

TELbespoke(0, Kbespoke) / PELbespoke = TELindex(0, Kindex) / PELindex
Correlation risk mapping
The underlying theory here is that if two tranches have the same sensitivity to correlation, then they are equivalent and should be priced using the same correlation assumption.

Of the methods above, ATM matching is easy and fast to implement. However, it does not take dispersion into account and is discontinuous in the case of default.
Although the probability matching method is iterative, and the loss distribution is discontinuous under deterministic recovery assumptions, it does take dispersion into account and is continuous in the case of defaults; hence it is one of the preferred methods for mapping bespoke tranches. All the mapping methods are most accurate when the bespoke's risk characteristics are not significantly different from those of the index.
Interpolating and extrapolating the skew
When pricing non-standard strikes, one may be required to interpolate or extrapolate on the skew implied by standard strikes. The choice of interpolation method can have a significant impact on tranche pricing.
Using linear interpolation on the base correlation skew can give rise to arbitrage opportunities, since the interpolated skew is not smooth. Kinks in the skew produce discontinuities in the cumulative expected loss profile, which can result in a senior tranchlet having an expected loss higher than or equal to that of a more junior one, giving rise to arbitrage opportunities. For arbitrage-free pricing across the capital structure, the cumulative loss function should be increasing and concave.
Figure 3, on page 31, presents the cumulative loss function for iTraxx Series 7 5Y as of 31 May 2007, based on linear interpolation of the correlation skew.
Figure 4 shows the magnified chart for the cumulative loss distribution in the 7%-22% region.
The discontinuity in the 11%-13% cumulative loss region implies a higher expected loss in the 12%-13% tranche as compared to the 11%-12% tranche, resulting in the 12%-13% tranche being priced at 4.92bps as compared with 2.98bps for the 11%-12% tranche.
Figure 5 shows the magnified chart for the cumulative loss function using spline interpolation, which is both concave and increasing and hence provides arbitrage-free pricing.
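The two no-arbitrage conditions are mechanical to verify on any interpolated curve. The sketch below, with hypothetical names, flags tranchlets whose expected loss is negative (curve not increasing) or exceeds that of the tranchlet below (curve not concave):

```python
def arbitrage_violations(strikes, cum_el):
    """Scan a cumulative expected loss curve for no-arbitrage breaches:
    the curve must be increasing (every tranchlet has non-negative EL)
    and concave (a tranchlet's EL per unit of width cannot exceed that
    of the tranchlet below it). Returns offending (k_low, k_high)."""
    violations = []
    prev_slope = float("inf")
    for i in range(1, len(strikes)):
        slope = (cum_el[i] - cum_el[i - 1]) / (strikes[i] - strikes[i - 1])
        if slope < 0 or slope > prev_slope + 1e-12:
            violations.append((strikes[i - 1], strikes[i]))
        prev_slope = slope
    return violations
```

Run on a curve with a kink of the kind described above, the check would flag the more senior tranchlet as the breach, mirroring the 12%-13% versus 11%-12% example.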
The equity tranchlet market was liquid briefly in early 2006, but has not seen much progress since.
Participants use different extrapolation methodologies below the standard 3%, which may have a significant impact on the bespoke equity tranche pricing (since most bespokes tend to be riskier than the index and hence their equity tranches map below 3% on the standard index skew).
The final graph on page 31 shows various views in the market regarding extrapolating the skew below 3%.
Being at either extreme end of the tranchlet skew would result in a significant difference in the pricing of the 0-1, 1-2 and 2-3 tranchlets, since the riskiest tranches are the most sensitive to correlation assumptions.
Mapping on the maturity dimension
Bespoke portfolios may not necessarily have the same maturity as the index and one needs to be able to imply a skew for the bespoke maturity based on the observed index skews for standard maturities.
Interpolating on maturity
The bespoke portfolio is mapped independently to the two adjacent index maturities, and interpolating the mapped implied skews from the two maturities gives the skew for the bespoke maturity. This is easy to implement, and the motivation behind it is that a six-year bespoke portfolio would exhibit the skew characteristics of both the five-year and seven-year skews; interpolation on expected loss or correlation can capture these dynamics.
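A minimal sketch of the interpolation, assuming the two mapped skews are stored strike by strike (the function name and data layout are hypothetical):

```python
def interpolate_skew(t, t_lo, skew_lo, t_hi, skew_hi):
    """Strike-by-strike linear interpolation between the bespoke skews
    mapped at the two adjacent index maturities (eg, five and seven
    years), giving the skew at the bespoke maturity t."""
    w = (t - t_lo) / (t_hi - t_lo)
    return {k: (1 - w) * skew_lo[k] + w * skew_hi[k] for k in skew_lo}
```

The same blending could equally be applied to expected losses rather than correlations, as the text notes.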
Term structure of correlation
In order to consistently price longer-dated tranches and tranches with non-standard maturities, taking into account the evolution of spreads and expected losses over time, we bootstrap the correlation skews for longer maturities from the prices and correlation skews at shorter maturities. When calibrating the term-structure skew of a seven-year tranche, for example, the expected loss of the seven-year tranche is calculated as the sum of:
1. The expected loss of the five-year tranche implied from the five-year flat skew.
2. The expected loss of the tranche in year six, implied by the average of the five-year flat skew and the seven-year term-structure skew.
3. The expected loss of the tranche in year seven implied by the seven-year term-structure skew.
The tranche price implied by the loss function in the table above is equated to the market price of the seven-year tranche by solving for the seven-year term-structure correlation. Once the term-structure skews are calibrated at all maturities, one can simply bootstrap these surfaces to the bespoke maturity in order to price non-index maturities.
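The three-piece sum and the solve for the seven-year term-structure correlation can be sketched as follows, taking the tranche pricing function as a user-supplied input; the assumption that expected loss decreases as correlation rises (as for a base tranche) is an illustrative one, noted in the code:

```python
def seven_year_el(rho5_flat, rho7_ts, tranche_el):
    """Expected loss of a seven-year tranche as the sum of three pieces:
    years 0-5 off the flat five-year skew, year 6 off the average of the
    five-year flat and seven-year term-structure skews, and year 7 off
    the seven-year term-structure skew. tranche_el(rho, t0, t1) is a
    user-supplied pricing function."""
    return (tranche_el(rho5_flat, 0, 5)
            + tranche_el(0.5 * (rho5_flat + rho7_ts), 5, 6)
            + tranche_el(rho7_ts, 6, 7))

def solve_ts_correlation(rho5_flat, market_el, tranche_el):
    """Bisect for the seven-year term-structure correlation that equates
    the three-piece expected loss to the quoted seven-year tranche EL.
    Assumes EL decreases as correlation rises, as for a base tranche."""
    lo, hi = 0.001, 0.999
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if seven_year_el(rho5_flat, mid, tranche_el) > market_el:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```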
Mapping on region
Most bespoke portfolios tend to have both European and North American names. The two markets have different implied skews reflecting the difference in underlying market dynamics. In order to accurately price global bespoke tranches, one needs to combine the mapped skew implied by each market.
Separate weighted average method
This involves mapping the bespoke tranche independently to the iTraxx and CDX skews and using a weighted average of the two mapped skews to imply a skew for the bespoke tranche.
This is easy to implement but the main drawback of this approach is that it maps the bespoke to iTraxx, assuming the bespoke is 100% European, and then maps the bespoke to CDX assuming it is 100% North American.
As a result, the iTraxx and CDX strikes equivalent to the bespoke strike are not equivalent to each other.
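The separate method's final blending step can be sketched in one line (the name is hypothetical); the drawback noted above is visible in that each input correlation is obtained as if the bespoke were entirely in one region:

```python
def separate_weighted_skew(rho_itraxx, rho_cdx, alpha):
    """Separate weighted average method: blend the two independently
    mapped correlations with the bespoke's regional weights
    (alpha = iTraxx share of the bespoke)."""
    return alpha * rho_itraxx + (1 - alpha) * rho_cdx
```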
Joint weighted average method
Using the probability matching approach as an example, the joint weighted average method involves solving jointly for the equivalent iTraxx and CDX strikes, using the weighted average of the iTraxx and CDX correlations corresponding to those strikes. Writing PiTraxx(KiTraxx) and PCDX(KCDX) for the default probabilities of the 0-3 bespoke tranche mapped to CDX and iTraxx individually, we calculate

ρbespoke = α ρiTraxx(KiTraxx) + (1 − α) ρCDX(KCDX)

where α is the weight of iTraxx in the bespoke, such that

Pbespoke(Kbespoke, ρbespoke) = PiTraxx(KiTraxx, ρbespoke) = PCDX(KCDX, ρbespoke)
This is calculation-intensive and requires a multivariate solver, but offers only marginal benefit over the separate method when the two index skews are relatively close to each other.
The two methods diverge when one is mapping to two significantly different portfolios in terms of expected loss and implied correlation skews that is, IG and HY.
In conclusion
Agreement on the model is one piece of the puzzle and selecting an appropriate mapping methodology across risk, region, skew and maturity dimensions is the other.
This still leaves us with some open questions, namely:
1. The choice of index series to be used for mapping, given reasonable liquidity in off-the-run index tranches.
2. Quantifying the effect of dispersion on correlation, in order to price bespokes with higher dispersion relative to the index.
3. Taking into account the difference in the number of entities between the bespoke and the index.
This has gained importance recently, with the iTraxx Xover (50 entities) skew implied from the CDX HY (100 entities) skew.
4. Baskets with higher sector concentration than indices need to be priced off higher correlations than those implied by the index skews.
5. Pricing bespoke tranches with short buckets.
6. Taking into account supply-demand and liquidity characteristics, dictated to some extent by bespoke issuance, the risk appetite of the buy side and the direction of buy-side positioning.
Copyright Infopro Digital Limited. All rights reserved.