Risk glossary
Value-at-risk (VAR)
Value-at-risk is a statistical measure of the riskiness of financial entities or portfolios of assets.
It is defined as the maximum dollar amount expected to be lost over a given time horizon, at a pre-defined confidence level. For example, if the 95% one-month VAR is $1 million, there is 95% confidence that over the next month the portfolio will not lose more than $1 million.
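In more formal terms (a standard textbook formulation rather than wording from this glossary), if L denotes the loss over the chosen horizon and α the confidence level, VAR can be written as the smallest loss threshold that is exceeded with probability at most 1 − α:

$$\mathrm{VAR}_{\alpha}(L) = \inf\{\, l \in \mathbb{R} : P(L > l) \le 1 - \alpha \,\}$$

In the example above, α = 0.95 and the threshold is $1 million, so the probability of losing more than $1 million over the month is at most 5%.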
VAR can be calculated using different techniques. Under the parametric method, also known as the variance-covariance method, VAR is calculated as a function of the mean and variance of the returns series, assuming a normal distribution. With the historical method, VAR is determined by taking the returns belonging to the lowest quantile of the series (identified by the confidence level) and observing the highest of those returns. The Monte Carlo method simulates a large number of scenarios for the portfolio and determines VAR by observing the distribution of the resulting paths.
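The sketch below illustrates these three approaches for a single series of returns held in a NumPy array, assuming a one-period horizon; the function names, the 95% default confidence level and the normal model used in the Monte Carlo leg are illustrative assumptions rather than a reference implementation.

```python
import numpy as np
from scipy.stats import norm

def parametric_var(returns, confidence=0.95, portfolio_value=1.0):
    """Variance-covariance VAR: assumes the returns are normally distributed."""
    mu, sigma = returns.mean(), returns.std(ddof=1)
    # Return quantile at the (1 - confidence) level under the normal assumption
    cutoff = mu + sigma * norm.ppf(1 - confidence)
    return -cutoff * portfolio_value  # loss expressed as a positive dollar amount

def historical_var(returns, confidence=0.95, portfolio_value=1.0):
    """Historical VAR: highest return within the lowest (1 - confidence) quantile."""
    cutoff = np.quantile(returns, 1 - confidence)
    return -cutoff * portfolio_value

def monte_carlo_var(returns, confidence=0.95, portfolio_value=1.0,
                    n_scenarios=100_000, seed=0):
    """Monte Carlo VAR: simulate many scenarios and read the quantile off the results."""
    rng = np.random.default_rng(seed)
    mu, sigma = returns.mean(), returns.std(ddof=1)
    # Single-period scenarios drawn from a simple normal model, for illustration only
    simulated = rng.normal(mu, sigma, n_scenarios)
    cutoff = np.quantile(simulated, 1 - confidence)
    return -cutoff * portfolio_value
```

Applied to the same return series, the three functions will generally produce somewhat different figures, which is one of the drawbacks discussed below.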
Despite being widely used, VAR suffers from a number of drawbacks. Firstly, while it quantifies the potential loss within the confidence level, it gives no indication of the size of the losses in the tail of the probability distribution beyond that level. Secondly, it is not additive: the VAR figures of a portfolio's components do not sum to the VAR of the overall portfolio, because the measure does not take correlations into account and simple addition could lead to double counting. Lastly, different calculation methods give different results.
Expected shortfall, an alternative risk measure, aims to mitigate some of VAR’s flaws.
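As a rough illustration of how expected shortfall addresses the tail drawback noted above, it averages the losses beyond the VAR cutoff rather than reporting the cutoff alone. The sketch below reuses the historical approach from the earlier example; the function name and the 95% default are again illustrative assumptions.

```python
def historical_expected_shortfall(returns, confidence=0.95, portfolio_value=1.0):
    """Average loss in the tail beyond the historical VAR cutoff."""
    cutoff = np.quantile(returns, 1 - confidence)
    tail = returns[returns <= cutoff]       # the worst (1 - confidence) of outcomes
    return -tail.mean() * portfolio_value   # expected loss given the tail is reached
```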