This article was first published on **Portfolio Probe » R language**, and kindly contributed to R-bloggers.

An introduction to estimating Value at Risk and Expected Shortfall, and some hints for doing it with R.

## Previously

“The basics of Value at Risk and Expected Shortfall” provides an introduction to the subject.

## Starting ingredients

Value at Risk (VaR) and Expected Shortfall (ES) are always about a portfolio.

There are two basic ingredients that you need:

- The positions within the portfolio
- A history of prices for the assets involved

With these you can derive two less basic ingredients:

- The portfolio weights come from the positions and the current prices
- The asset return history can be found from the price history

These can be used to estimate market risk. There are other risks — such as credit risk — that may not be included in the price history.
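The first derived ingredient is just each position's market value divided by the total portfolio value. A minimal sketch, where `positions` and `currentPrices` are hypothetical vectors of share counts and matching prices:

```r
# hypothetical inputs: share counts and matching current prices
positions <- c(IBM = 100, MSFT = 250, XOM = 80)
currentPrices <- c(IBM = 180, MSFT = 30, XOM = 85)

portVals <- positions * currentPrices   # market value of each position
portWts <- portVals / sum(portVals)     # weights sum to 1
```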

## Multivariate estimation

VaR and ES are each a single risk number at the portfolio level while we are starting at the asset level. One approach is to estimate a variance matrix of the asset returns and then use the portfolio weights to collapse to the portfolio variance.

The multivariate approach is most often used when you want to see the sources of risk rather than just a single number.
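The collapse from a variance matrix to a portfolio number is a quadratic form in the weights. A sketch with made-up inputs (`covMat` and `portWts` here are illustrative, not from any real data):

```r
# hypothetical inputs: per-period variance matrix and matching weights
covMat <- matrix(c(4e-4, 1e-4, 1e-4, 9e-4), 2, 2)
portWts <- c(.6, .4)

# portfolio volatility from the quadratic form w' C w
portVol <- sqrt(drop(t(portWts) %*% covMat %*% portWts))

# 5% one-period VaR in return terms, assuming normality and zero mean
VaRnorm <- -qnorm(.05, sd = portVol)
```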

## Univariate estimation

Estimation is simpler with a single time series of returns for the portfolio — the portfolio as it is now.

We can get this by matrix-multiplying the matrix of simple returns of the assets in the portfolio by the portfolio weights. In R notation this is:

```r
R1 <- assetSimpRetMatrix %*% portWts
```

or, safer:

```r
R1 <- assetSimpRetMatrix[, names(portWts)] %*% portWts
```

Note that this is similar to the computation that was warned about in “An easy mistake with returns”. But in this case we don’t want the returns of a real portfolio — we want the hypothetical returns of our portfolio as it now exists.

The `R1` object computed above holds the (hypothetical) simple returns of the portfolio. Modeling is often better with log returns. You can transform from simple to log returns like:

```r
r1 <- log(R1 + 1)
```

There are additional choices, of course, but some common methods are:

- historical (use the empirical distribution over some number of the most recent time periods)
- normal distribution (estimate the parameters from the data and use the appropriate quantile)
- t-distribution (usually assuming the degrees of freedom rather than estimating them)
- fit a univariate garch model and simulate ahead

## R Implementations

Here is an incomplete guide to VaR and ES in the R world.

My search for R functionality was:

- Use Rseek to do appropriate searches
- Look in the Finance task view

### PerformanceAnalytics

This package has functions `VaR` and `ES`. They take a vector or matrix of asset returns. Component VaR, marginal VaR and component ES can be computed. Distributions include historical, normal and a Cornish-Fisher approximation.

Here are some examples where `spxret11` is a vector of the daily log returns of the S&P 500 during 2011. So we are getting the risk measure (in returns) for the first day of 2012.

```r
> VaR(spxret11, method="historical")
           [,1]
VaR -0.02515786
> VaR(spxret11, method="gaussian")
          [,1]
VaR -0.0241509
> VaR(spxret11, method="gaussian", p=.99)
           [,1]
VaR -0.03415703
> ES(spxret11, method="historical")
           [,1]
ES  -0.03610873
> ES(spxret11, method="gaussian")
           [,1]
ES  -0.03028617
```

If the first argument is a matrix, then each column can be thought of as an asset within the portfolio. This is illustrated with some data from the package:

```r
> data(edhec)
> VaR(edhec[, 1:5], portfolio_method="component")
no weights passed in, assuming equal weighted portfolio
$MVaR
           [,1]
[1,] 0.02209855

$contribution
Convertible Arbitrage            CTA Global 
         0.0052630876         -0.0001503125 
Distressed Securities      Emerging Markets 
         0.0047567783          0.0109935244 
Equity Market Neutral 
         0.0012354711 

$pct_contrib_MVaR
Convertible Arbitrage            CTA Global 
          0.238164397          -0.006801916 
Distressed Securities      Emerging Markets 
          0.215252972           0.497477204 
Equity Market Neutral 
          0.055907342 
```

### Package actuar

This package also has a `VaR` function, which works with a special form of distribution objects.

### Doing it yourself

There is not much functionality in R specifically for Value at Risk and Expected Shortfall, probably because it is extremely easy to do whatever you want yourself.

**Warning**: none of the functions given below have been tested. There is a reasonably high probability of bugs.

The functions are placed in the public domain — you are free to copy them and use them however you like.

#### historical

Here is a definition of a simple function for historical estimation of Value at Risk:

```r
VaRhistorical <- function(returnVector, prob=.05,
                          notional=1, digits=2) {
  if(prob > .5) prob <- 1 - prob
  ans <- -quantile(returnVector, prob) * notional
  signif(ans, digits=digits)
}
```

This is used for a 13 million dollar portfolio like:

```r
> VaRhistorical(spxret11, notional=13e6)
    5% 
330000 
```

The expected shortfall is barely more complicated:

```r
EShistorical <- function(returnVector, prob=.05,
                         notional=1, digits=2) {
  if(prob > .5) prob <- 1 - prob
  v <- quantile(returnVector, prob)
  ans <- -mean(returnVector[returnVector <= v]) * notional
  signif(ans, digits=digits)
}
```

This can be used like:

```r
> EShistorical(spxret11, notional=13e6)
[1] 470000
```

So the Value at Risk is $330,000 and the Expected Shortfall is $470,000.

#### normal distribution

There’s a better (in a statistical sense) version later, but here is a simple approach to getting Value at Risk assuming a normal distribution:

```r
VaRnormalEqwt <- function(returnVector, prob=.05,
                          notional=1, expected.return=mean(returnVector),
                          digits=2) {
  if(prob > .5) prob <- 1 - prob
  ans <- -qnorm(prob, mean=expected.return,
                sd=sd(returnVector)) * notional
  signif(ans, digits=digits)
}
```

This is used like:

```r
> VaRnormalEqwt(spxret11, notional=13e6)
[1] 310000
> VaRnormalEqwt(spxret11, notional=13e6,
+    expected.return=0)
[1] 310000
```

Computing the Expected Shortfall in this case is slightly complicated because we need to find the expected value of the tail. Numerical integration works fine for this.

```r
ESnormalEqwt <- function(returnVector, prob=.05,
                         notional=1, expected.return=mean(returnVector),
                         digits=2) {
  if(prob > .5) prob <- 1 - prob
  retsd <- sd(returnVector)
  v <- qnorm(prob, mean=expected.return, sd=retsd)
  tailExp <- integrate(function(x) x * dnorm(x,
      mean=expected.return, sd=retsd), -Inf, v)$value / prob
  ans <- -tailExp * notional
  signif(ans, digits=digits)
}
```

The result for our example with this is:

```r
> ESnormalEqwt(spxret11, notional=13e6)
[1] 390000
```
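The numerical integration can also be avoided: for a normal distribution the tail expectation has a closed form, E[X | X ≤ v] = μ − σ·φ(z)/p, where z is the standard normal p-quantile. A sketch of the same calculation in closed form (the function name is mine, and this is an illustration rather than a drop-in replacement):

```r
ESnormalClosed <- function(returnVector, prob=.05,
                           notional=1, expected.return=mean(returnVector),
                           digits=2) {
  if(prob > .5) prob <- 1 - prob
  retsd <- sd(returnVector)
  # closed-form expectation of a normal below its prob-quantile
  tailExp <- expected.return - retsd * dnorm(qnorm(prob)) / prob
  signif(-tailExp * notional, digits=digits)
}
```

It should agree with the numerical-integration version to within integration error.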

A **much** better approach that is still quite simple is to use exponential smoothing to get the volatility (as the original RiskMetrics did):

```r
VaRnormalExpsmo <- function(returnVector, prob=.05,
                            notional=1, expected.return=mean(returnVector),
                            lambda=.97, digits=2) {
  if(prob > .5) prob <- 1 - prob
  retsd <- sqrt(tail(pp.exponential.smooth(
      returnVector^2, lambda), 1))
  ans <- -qnorm(prob, mean=expected.return, sd=retsd) * notional
  signif(ans, digits=digits)
}
```

where `pp.exponential.smooth` is taken from “Exponential decay models”.
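The real `pp.exponential.smooth` is defined in that post; if you don't have it handy, a stand-in exponentially weighted moving average along these lines should behave similarly (this version is my sketch, not the original):

```r
pp.exponential.smooth <- function(x, lambda=.97) {
  # s[t] = lambda * s[t-1] + (1 - lambda) * x[t], started at x[1]
  ans <- numeric(length(x))
  ans[1] <- x[1]
  for(i in seq_along(x)[-1]) {
    ans[i] <- lambda * ans[i-1] + (1 - lambda) * x[i]
  }
  ans
}
```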

```r
> VaRnormalExpsmo(spxret11, notional=13e6)
[1] 340000
```

#### t distribution

The tricky bit with the t distribution is remembering that its standard deviation is not 1:

```r
VaRtExpsmo <- function(returnVector, prob=.05,
                       notional=1, lambda=.97, df=7, digits=2) {
  if(prob > .5) prob <- 1 - prob
  retsd <- sqrt(tail(pp.exponential.smooth(
      returnVector^2, lambda), 1))
  ans <- -qt(prob, df=df) * retsd * sqrt((df - 2)/df) * notional
  signif(ans, digits=digits)
}
```

The result of this one is:

```r
> VaRtExpsmo(spxret11, notional=13e6)
2011-12-30 
    340000 
```
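A quick simulation check of that scaling factor: a t variable with `df` degrees of freedom has variance df/(df − 2), so multiplying by sqrt((df − 2)/df) brings it back to unit variance.

```r
df <- 7
set.seed(1)
x <- rt(1e5, df=df) * sqrt((df - 2)/df)
var(x)  # should be close to 1
```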

#### garch

There are several choices for garch estimation in R.
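One fully self-contained possibility, skipping the packages, is to fit a garch(1,1) by maximum likelihood with `optim` and read off a one-step-ahead normal VaR. This is my own sketch under simplifying assumptions (zero mean return, illustrative name and parametrization); a real application should use a dedicated garch package:

```r
VaRgarch11 <- function(returnVector, prob=.05, notional=1, digits=2) {
  if(prob > .5) prob <- 1 - prob
  n <- length(returnVector)
  # conditional variance recursion h[t] = omega + alpha*r[t-1]^2 + beta*h[t-1]
  condVar <- function(omega, alpha, beta) {
    h <- numeric(n)
    h[1] <- var(returnVector)
    for(i in 2:n) {
      h[i] <- omega + alpha * returnVector[i-1]^2 + beta * h[i-1]
    }
    h
  }
  # gaussian negative log-likelihood (constants dropped),
  # parameters transformed so omega > 0 and alpha + beta < 1
  negll <- function(par) {
    omega <- exp(par[1])
    alpha <- plogis(par[2])
    beta <- plogis(par[3]) * (1 - alpha)
    h <- condVar(omega, alpha, beta)
    sum(log(h) + returnVector^2 / h) / 2
  }
  opt <- optim(c(log(var(returnVector) / 10), 0, 0), negll)
  omega <- exp(opt$par[1])
  alpha <- plogis(opt$par[2])
  beta <- plogis(opt$par[3]) * (1 - alpha)
  h <- condVar(omega, alpha, beta)
  # one-step-ahead variance forecast, then the normal quantile
  hAhead <- omega + alpha * tail(returnVector, 1)^2 + beta * tail(h, 1)
  signif(-qnorm(prob, sd=sqrt(hAhead)) * notional, digits=digits)
}
```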

#### extreme value theory

There are also several choices of packages for extreme value theory. See, for instance, the Finance task view.

## Questions

What have I missed in the R world?

Are there any bugs in my functions?
