# What is GARCH?

**Short answer**

GARCH stands for Generalized Autoregressive Conditional Heteroscedasticity. This is an econometric model used for modelling and forecasting time-dependent variance, and hence volatility, of stock price returns. It represents current variance in terms of past variance(s).

**Example**

The simplest member of the GARCH family is GARCH(1,1), in which the variance, $v_n$, of stock returns at time step $n$ is modelled by

$$v_n = (1-\alpha-\beta)\,w_0 + \beta\,v_{n-1} + \alpha\,v_{n-1}B_{n-1}^2,$$

where $w_0$ is the long-term variance, $\alpha$ and $\beta$ are positive parameters with $\alpha+\beta<1$, and the $B_n$ are independent random numbers drawn from a standard normal distribution, so that $v_{n-1}B_{n-1}^2$ is the latest square of returns. The latest variance, $v_n$, can therefore be thought of as a weighted average of the most recent variance, the latest square of returns, and the long-term average.
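As a sketch of this recursion (the parameter values below are purely illustrative, not calibrated to any market data), a GARCH(1,1) path can be simulated with the standard library alone:

```python
import random

def simulate_garch11(w0, alpha, beta, v0, n_steps, seed=0):
    """Simulate GARCH(1,1): v_n = (1-alpha-beta)*w0 + beta*v_{n-1} + alpha*v_{n-1}*B**2,
    where B is a standard normal shock and the return is sqrt(v)*B."""
    rng = random.Random(seed)
    v = v0
    variances, returns = [], []
    for _ in range(n_steps):
        b = rng.gauss(0.0, 1.0)       # standard normal shock B_{n-1}
        variances.append(v)
        returns.append(v ** 0.5 * b)  # return over this time step
        v = (1.0 - alpha - beta) * w0 + beta * v + alpha * v * b * b
    return variances, returns

# Illustrative parameters: long-term variance 0.04, highly persistent volatility
vs, rs = simulate_garch11(w0=0.04, alpha=0.05, beta=0.90, v0=0.04, n_steps=1000)
```

Because $\alpha+\beta<1$ and $w_0>0$, the simulated variance stays strictly positive and clusters around the long-term level.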

**Long answer**

## What?

GARCH is one member of a large family of econometric models used to model time-varying variance. They are popular in quantitative finance because they can be used for measuring and forecasting volatility.

It is clear from simple equity or index data that volatility is not constant. If it were then estimating it would be very simple. After all, in finance we have what sometimes seems like limitless quantities of data. Since volatility varies with time we would like at the very least to know what it is *right now.* And, more ambitiously, we would like to know what it is going to be in the future, if not precisely then perhaps know its future expected value. This requires a model.

The simplest popular model assumes that we can get an estimate for volatility over the next *N* days (the future) by looking at volatility over the previous *N* days (the past). This moving-window volatility is initially appealing but suffers from the problem that if there was a one-off jump in the stock price it will remain in the data with the same weight for the next *N* days and then suddenly drop out. This leads to artificially inflated volatility estimates for a while. One way around this is to use the second most popular volatility model, the exponentially weighted moving average (EWMA). This takes the form

$$v_n = \lambda v_{n-1} + (1-\lambda)R_{n-1}^2,$$

where $\lambda$ is a parameter between 0 and 1, and the $R$'s are the returns, suitably normalized with the time step. This models the latest variance as a weighted average between the previous variance and the latest square of returns. The larger $\lambda$ the more weight is attached to the distant past and the less to the recent past. This model is also simple and appealing, but it has one drawback: it results in no term structure going into the future. The expected variance tomorrow, the day after, and every day in the future is just today's variance. This is counterintuitive, especially at times when volatility is at historical highs or lows.
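A minimal EWMA estimator implementing this recursion (the function name and the seed variance are my own choices):

```python
def ewma_variance(returns, lam, v0):
    """EWMA variance series: v_n = lam*v_{n-1} + (1-lam)*R_{n-1}**2."""
    v, out = v0, []
    for r in returns:
        v = lam * v + (1.0 - lam) * r * r
        out.append(v)
    return out

# With constant returns the estimate converges to the squared return
series = ewma_variance([0.01] * 500, lam=0.94, v0=0.0)
```

With $\lambda = 0.94$ (the classic RiskMetrics choice for daily data) and constant returns of 0.01, the series converges to $0.01^2 = 10^{-4}$, illustrating the fixed point of the recursion.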

And so we consider the third simplest model,

$$v_n = (1-\alpha-\beta)\,w_0 + \beta\,v_{n-1} + \alpha\,R_{n-1}^2,$$

the GARCH(1,1) model. This adds a constant, long-term variance, $w_0$, to the EWMA model. The expected variance, $k$ time steps in the future, then behaves like

$$E[v_{n+k}] = w_0 + (\alpha+\beta)^k\,(v_n - w_0).$$

Since $\alpha+\beta<1$ this is an exponential decay of the expected variance to its long-term mean: a much nicer, more realistic time dependence than we get from the EWMA model.
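The decay of the forecast towards the long-term variance can be checked directly (the numbers here are illustrative):

```python
def garch11_forecast(v_now, w0, alpha, beta, k):
    """Expected variance k steps ahead: E[v_{n+k}] = w0 + (alpha+beta)**k * (v_now - w0)."""
    return w0 + (alpha + beta) ** k * (v_now - w0)

# Starting above the long-term variance, forecasts decay monotonically toward it
forecasts = [garch11_forecast(0.09, w0=0.04, alpha=0.05, beta=0.90, k=k)
             for k in range(200)]
```

Starting at $v_n = 0.09$ with $w_0 = 0.04$ and $\alpha+\beta = 0.95$, the forecast halves its distance to the long-term mean roughly every $\ln 2 / \ln(1/0.95) \approx 13.5$ steps.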

In GARCH($p$, $q$) the ($p$, $q$) refers to there being $p$ past variances and $q$ past returns in the estimate:

$$v_n = \Big(1-\sum_{i=1}^{p}\beta_i-\sum_{j=1}^{q}\alpha_j\Big)w_0 + \sum_{i=1}^{p}\beta_i\,v_{n-i} + \sum_{j=1}^{q}\alpha_j\,R_{n-j}^2.$$
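One step of the general recursion can be written as follows (a sketch; the function name and argument convention, with the most recent values first, are my own):

```python
def garch_pq_step(w0, betas, alphas, past_vars, past_rets):
    """One GARCH(p,q) variance update: p past variances weighted by betas,
    q past squared returns weighted by alphas, plus the long-term term."""
    v = (1.0 - sum(betas) - sum(alphas)) * w0
    v += sum(b * pv for b, pv in zip(betas, past_vars))      # p variance terms
    v += sum(a * r * r for a, r in zip(alphas, past_rets))   # q return terms
    return v

# With p = q = 1 this reduces to the GARCH(1,1) update
v_next = garch_pq_step(0.04, betas=[0.90], alphas=[0.05],
                       past_vars=[0.05], past_rets=[0.02])
```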

## Why?

Volatility is a required input for all classical option-pricing models; it is also an input for many asset-allocation problems and risk estimation, such as Value at Risk. Therefore it is very important to have a method for forecasting future volatility.

There is one slight problem with these econometric models, however. The econometrician develops his volatility models in discrete time, whereas the option-pricing quant would ideally like a continuous-time stochastic differential equation model. Fortunately, in many cases the discrete-time model can be reinterpreted as a continuous-time model (there is *weak convergence* as the time step gets smaller), and so both the econometrician and the quant are happy. Still, of course, the econometric models, being based on real stock price data, result in a model for the *real* and not the *risk-neutral* volatility process. To go from one to the other requires knowledge of the market price of volatility risk.

## How?

The parameters in these models are usually determined by Maximum Likelihood Estimation applied to the (log)likelihood function. Although this technique is usually quite straightforward to apply there can be difficulties in practice. These difficulties can be associated with

• having insufficient data

• the (log)likelihood function being very 'flat' with respect to the parameters, so that the maximum is insensitive to the parameter values

• estimating the wrong model, including having too many parameters (the best model may be simpler than you think).
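As a sketch of how the fit works in principle: build the GARCH(1,1) variance series, evaluate the Gaussian log-likelihood of the returns under it, and maximize over the parameters. A crude grid search stands in for a proper optimizer here, and all names, parameter grids, and sample data are illustrative:

```python
import math

def garch11_neg_loglik(returns, w0, alpha, beta, v0):
    """Gaussian negative log-likelihood of returns under GARCH(1,1) variances."""
    v, nll = v0, 0.0
    for r in returns:
        nll += 0.5 * (math.log(2.0 * math.pi * v) + r * r / v)
        v = (1.0 - alpha - beta) * w0 + beta * v + alpha * r * r
    return nll

def fit_garch11_grid(returns, w0, v0, grid):
    """Minimize the negative log-likelihood over a coarse (alpha, beta) grid."""
    best = None
    for alpha in grid:
        for beta in grid:
            if alpha + beta >= 1.0:
                continue  # enforce the stationarity constraint alpha + beta < 1
            nll = garch11_neg_loglik(returns, w0, alpha, beta, v0)
            if best is None or nll < best[0]:
                best = (nll, alpha, beta)
    return best

sample = [0.01, -0.02, 0.015, -0.005] * 50
nll, alpha_hat, beta_hat = fit_garch11_grid(sample, w0=2e-4, v0=2e-4,
                                            grid=[0.05, 0.1, 0.3, 0.6, 0.9])
```

A flat likelihood surface shows up here as many grid points with nearly identical `nll` values, which is exactly the second difficulty listed above.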

## Family members

Here are *some* of the other members of the GARCH family. New ones are being added all the time; they are breeding like rabbits. In these models the 'shocks' can typically have either a normal distribution, a Student's t distribution or a Generalized Error distribution, the latter two having fatter tails.

**NGARCH**

This is similar to GARCH(1,1) but the parameter $\gamma$ permits correlation between the stock and volatility processes.

**AGARCH**

Absolute value GARCH. Similar to GARCH but with the volatility (not the variance) being linear in the absolute value of returns (instead of the square of returns).

**EGARCH**

Exponential GARCH. This models the logarithm of the variance. The model also accommodates asymmetry in that negative shocks can have a bigger impact on volatility than positive shocks.

**REGARCH**

Range-based Exponential GARCH. This models the low-to-high range of asset prices over a 'day'.

**IGARCH**

Integrated GARCH. This is a type of GARCH model with further constraints on the parameters.

**FIGARCH**

Fractionally Integrated GARCH. This model uses the fractional differencing lag operator applied to the variance. This adds an extra parameter to the GARCH model, and is such that it includes GARCH and IGARCH as extremes. This model has the long-memory, slow decay of volatility seen in practice.

**FIEGARCH**

Fractionally Integrated Exponential GARCH. This models the logarithm of variance and again has the long-memory, slow decay of volatility seen in practice.

**TGARCH**

Threshold GARCH. This is similar to GARCH but includes an extra term that kicks in when the shock is negative. This gives a realistic asymmetry to the volatility model.

**PGARCH**

Power GARCH. In this model the variance is raised to a power other than zero (logarithm), one (AGARCH) or two. This model can have the long-memory, slow decay of volatility seen in practice.

**CGARCH**

Component GARCH. This models variance as the sum of two or more 'components'. In a two-component model, for example, one component is used to capture short-term and another the long-term effects of shocks. This model therefore has the long-memory, slow decay of volatility seen in practice.

**References and Further Reading**

Bollerslev, T 1986 Generalized Autoregressive Conditional Heteroskedasticity. *Journal of Econometrics* **31** 307–327

Engle, R 1982 Autoregressive Conditional Heteroskedasticity with estimates of the variance of United Kingdom inflation. *Econometrica* **50** 987–1008