Lattice model (finance)

From Wikipedia, the free encyclopedia
For other meanings, see lattice model (disambiguation)
Binomial Lattice with CRR formulae

In finance, a lattice model is a technique applied to the valuation of derivatives where, because of path dependence in the payoff, (1) a discretized model is required and (2) Monte Carlo methods fail to account for optimal decisions to terminate the derivative by early exercise.[1] For equity options, a typical example is pricing an American option, where a decision as to exercise is required at "all" times (any time) before and including maturity. A continuous model, on the other hand, such as Black–Scholes, would only allow for the valuation of European options, where exercise is on the option's maturity date. For interest rate derivatives, lattices are additionally useful in that they address many of the issues encountered with continuous models, such as pull to par.[2]

Equity and commodity derivatives[edit]

Tree-based equity option valuation:

1. Construct the tree of equity-prices:

  • Either forward-construct, applying an up or down factor (u or d) to the current price, such that in the next period the price will be either S_up = S·u or S_down = S·d;
  • or, given that the tree is recombining (d = 1/u), directly via S_n = S_0·u^(N_u − N_d), where N_u is the number of up ticks and N_d is the number of down ticks.

2. Construct the corresponding option tree:

  • at each final node of the tree — i.e. at expiration of the option — the option value is simply its intrinsic, or exercise, value;
  • at earlier nodes, value is via expectation: C_{t−Δt,i} = e^{−rΔt}·(p·C_{t,i+1} + (1−p)·C_{t,i−1}), with p the probability of an up move; where non-European value is the greater of this and the exercise value given the corresponding equity value.

In general the approach is to divide time between now and the option's expiration into N discrete periods. At the specific time n, the model has a finite number of outcomes at time n + 1 such that every possible change in the state of the world between n and n + 1 is captured in a branch. This process is iterated until every possible path between n = 0 and n = N is mapped. Probabilities are then estimated for every n to n + 1 path. The outcomes and probabilities flow backwards through the tree until a fair value of the option today is calculated.

For equity and commodities the application is as follows. The first step is to trace the evolution of the option's key underlying variable(s), starting with today's spot price, such that this process is consistent with its volatility; log-normal Brownian motion with constant volatility is usually assumed.[2] The next step is to value the option recursively, stepping backwards from the final time-step, and applying risk neutral valuation at each node, where option value is the probability-weighted present value of the up- and down-nodes in the later time-step. See Binomial options pricing model#Method for more detail, as well as Rational pricing#Risk neutral valuation for logic and formulae derivation.

As above, the lattice approach is particularly useful in valuing American options, where the choice whether to exercise the option early, or to hold the option, may be modeled at each discrete time/price combination; this is true also for Bermudan options. For similar reasons, real options and employee stock options are often modeled using a lattice framework, though with modified assumptions. In each of these cases, a third step is to determine whether the option is to be exercised or held, and to then apply this value at the node in question. Some exotic options, such as barrier options, are also easily modeled here; note though that for other path-dependent options, simulation would be preferred.

The simplest lattice model is the binomial options pricing model;[3] the standard ("canonical"[4]) method is that proposed by Cox, Ross and Rubinstein (CRR) in 1979; see the diagram for formulae. Over 20 other methods have been developed,[5] each "derived under a variety of assumptions" as regards the development of the underlying's price.[6] In the limit, as the number of time-steps increases, these converge to the log-normal distribution, and hence produce the "same" option price as Black–Scholes: to achieve this, they variously seek to agree with the underlying's central moments, raw moments and/or log-moments at each time-step, as measured discretely. Further enhancements are designed to achieve stability relative to Black–Scholes as the number of time-steps changes. More recent models, in fact, are designed around direct convergence to Black–Scholes.[7]
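The backward-induction scheme described above can be sketched in a few lines. The following is a minimal illustration using the CRR parameterization (u = e^{σ√Δt}, d = 1/u, p = (e^{rΔt} − d)/(u − d)), ignoring dividends; the function and parameter names are illustrative, not from any particular library.

```python
import math

def crr_american(S0, K, r, sigma, T, steps, is_call=False):
    """Price an American option on a CRR binomial lattice.

    u = exp(sigma*sqrt(dt)), d = 1/u, p = (exp(r*dt) - d) / (u - d).
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt))
    d = 1.0 / u
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up-probability
    disc = math.exp(-r * dt)

    def payoff(S):
        return max(S - K, 0.0) if is_call else max(K - S, 0.0)

    # Step 1-2 (final nodes): intrinsic value at S_N = S0 * u^j * d^(N-j)
    values = [payoff(S0 * u**j * d**(steps - j)) for j in range(steps + 1)]

    # Earlier nodes: discounted expectation vs immediate exercise
    for i in range(steps - 1, -1, -1):
        values = [
            max(disc * (p * values[j + 1] + (1 - p) * values[j]),
                payoff(S0 * u**j * d**(i - j)))
            for j in range(i + 1)
        ]
    return values[0]
```

For an American call on a non-dividend-paying stock, early exercise is never optimal, so the result should approach the Black–Scholes European value as the number of steps grows; the American put will price above its European counterpart.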

A variant on the binomial is the trinomial tree,[8][9] developed by Phelim Boyle in 1986, where valuation is based on the value of the option at the up-, down- and middle-nodes in the later time-step. As for the binomial, a similar (although smaller) range of methods exists. The trinomial model is considered[10] to produce more accurate results than the binomial model when fewer time steps are modelled, and is therefore used when computational speed or resources may be an issue. For vanilla options, as the number of steps increases, the results rapidly converge, and the binomial model is then preferred due to its simpler implementation. For exotic options the trinomial model (or adaptations) is sometimes more stable and accurate, regardless of step-size.
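A trinomial analogue can be sketched similarly. The parameterization below (u = e^{σ√(2Δt)}, middle factor 1, with the probabilities shown in the code) is one common choice among several in the literature, not Boyle's original; names are illustrative and dividends are again ignored.

```python
import math

def trinomial_american(S0, K, r, sigma, T, steps, is_call=False):
    """Price an American option on a recombining trinomial lattice.

    At step i there are 2i+1 nodes; node j has spot S0 * u**(j - i).
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(2 * dt))   # up factor; middle = 1, down = 1/u
    a = math.exp(r * dt / 2)
    b = math.exp(sigma * math.sqrt(dt / 2))
    pu = ((a - 1 / b) / (b - 1 / b)) ** 2     # up-, down- and middle-probabilities
    pd = ((b - a) / (b - 1 / b)) ** 2
    pm = 1 - pu - pd
    disc = math.exp(-r * dt)

    def payoff(S):
        return max(S - K, 0.0) if is_call else max(K - S, 0.0)

    values = [payoff(S0 * u ** (j - steps)) for j in range(2 * steps + 1)]
    for i in range(steps - 1, -1, -1):
        values = [
            max(disc * (pd * values[j] + pm * values[j + 1] + pu * values[j + 2]),
                payoff(S0 * u ** (j - i)))
            for j in range(2 * i + 1)
        ]
    return values[0]
```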

When it is important to incorporate the volatility smile, or surface, implied trees can be constructed. Here, the tree is solved such that it successfully reproduces selected (or all) market prices, across various strikes and expirations; see local volatility. These trees thus "ensure that all European standard options (with strikes and maturities coinciding with the tree nodes) will have theoretical values which match their market prices."[11] Using the calibrated lattice one can then price options with strike/maturity combinations not quoted in the market, such that these prices are consistent with observed volatility patterns. There exist both implied binomial trees, often Rubinstein IBTs (R-IBT),[12] and implied trinomial trees, often Derman–Kani–Chriss[13] (DKC; superseding the Derman–Kani IBT[14]). The former is more easily built, but is consistent with one maturity only; the latter will be consistent with, but at the same time requires, known (or interpolated) prices at all time-steps.

As regards construction, for an R-IBT the first step is to recover the "implied ending risk-neutral probabilities" of spot prices. Then, by the assumption that all paths which lead to the same ending node have the same risk-neutral probability, a "path probability" is attached to each ending node. Thereafter "it's as simple as one-two-three", and a three-step backwards recursion allows the node probabilities to be recovered for each time step. Option valuation then proceeds as standard. For DKC, the first step is to recover the state prices corresponding to each node in the tree, such that these are consistent with the observed volatility surface. Thereafter the up-, down- and middle-probabilities are found for each node such that: these sum to 1; spot prices adjacent time-step-wise evolve risk-neutrally, incorporating dividend yield; and state prices similarly "grow" at the risk-free rate.[15] (The solution here is iterative per time-step, as opposed to simultaneous.) As for R-IBTs, option valuation is then by standard backward recursion.
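The R-IBT recursion can be sketched as follows, taking the ending spot prices and risk-neutral probabilities as given analyst inputs and assuming zero dividends; the function name is illustrative. The backward pass recovers path probabilities, implied transition probabilities, and node spot prices in one sweep, and prices a European payoff along the way.

```python
from math import comb, exp

def ribt_price(S_end, Q_end, r, dt, payoff):
    """Rubinstein implied binomial tree from a supplied ending distribution.

    S_end[j], Q_end[j]: spot and risk-neutral probability at ending
    node j (j = 0..N, low to high).  Returns the implied time-0 spot
    and the value of a European option with the given payoff.
    """
    N = len(S_end) - 1
    # Step one: all paths into an ending node are equally likely
    w = [Q_end[j] / comb(N, j) for j in range(N + 1)]
    S = list(S_end)
    V = [payoff(s) for s in S_end]
    disc = exp(-r * dt)
    # Steps two and three: recover path probs, up-probabilities and spots
    for i in range(N - 1, -1, -1):
        w_new, S_new, V_new = [], [], []
        for j in range(i + 1):
            wj = w[j] + w[j + 1]              # path probability through (i, j)
            q = w[j + 1] / wj                 # implied up-probability
            S_new.append(disc * (q * S[j + 1] + (1 - q) * S[j]))
            V_new.append(disc * (q * V[j + 1] + (1 - q) * V[j]))
            w_new.append(wj)
        w, S, V = w_new, S_new, V_new
    return S[0], V[0]
```

As a sanity check, feeding in the ending distribution of a plain CRR tree should recover the CRR spot and European option value exactly.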

As an alternative, Edgeworth binomial trees[16] allow for an analyst-specified skew and kurtosis in spot price returns; see Edgeworth series. This approach is useful when the underlying's behavior departs (markedly) from normality. A related use is to calibrate the tree to the volatility smile (or surface) by a "judicious choice"[17] of parameter values: options priced on such a tree with differing strikes will return differing implied volatilities. For pricing American options, an Edgeworth-generated ending distribution may be combined with an R-IBT. This approach is limited as to the set of skewness and kurtosis pairs for which valid distributions are available. One recent proposal, Johnson binomial trees, is to use Johnson's system of distributions, as this is capable of accommodating all possible pairs; see Johnson SU distribution.

For multiple underlyings, multinomial lattices[18][19] can be built, although the number of nodes increases exponentially with the number of underlyings. As an alternative, basket options, for example, can be priced using an "approximating distribution"[20] via an Edgeworth (or Johnson) tree.

Interest rate derivatives[edit]

Tree-based bond option valuation:

0. Construct an interest-rate tree, which, as described in the text, will be consistent with the current term structure of interest rates.

1. Construct a corresponding tree of bond-prices, where the underlying bond is valued at each node by "backwards induction":

  • at its final nodes, bond value is simply face value (or $1), plus coupon (in cents) if relevant, discounted to the start of the time-step using the corresponding short-rate;
  • at each earlier node, it is the discounted expected value of nodes in the later time step, plus coupon payments during the current time step, similarly discounted to the start of the time-step.

2. Construct a corresponding bond-option tree, where the option on the bond is valued similarly:

  • at option maturity, value is based on moneyness for all nodes in that time-step;
  • at earlier nodes, value is a function of the short-rate-discounted expected value of the option at the nodes in the later time step; where non-European value is the greater of this and the exercise value given the corresponding bond value.

Lattices are commonly used in valuing bond options, swaptions, and other interest rate derivatives.[21][22] In these cases the valuation is largely as above, but requires an additional, zeroth, step of constructing an interest rate tree, on which the value of the underlying is then based. Note that the underlying here is valued via "backward induction", i.e. flowing backwards from maturity and incorporating scheduled cash flows at each node, as opposed to forwards from valuation date as above. The next step, option valuation, then proceeds as standard. See aside.
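Steps 1 and 2 of the aside can be sketched as follows, taking the (already calibrated) short-rate tree of step 0 as given, with equal risk-neutral probabilities on each branch, and a zero-coupon underlying bond for simplicity; names are illustrative.

```python
import math

def bond_option_on_rate_tree(rates, bond_T, opt_T, strike, dt,
                             is_call=True, american=False):
    """Value a zero-coupon (face 1) bond maturing at step bond_T, then an
    option on it expiring at step opt_T, on a recombining short-rate tree.

    rates[i][j]: short rate at time-step i, state j (j = 0..i),
    with risk-neutral probability 1/2 for each branch.
    """
    # Step 1: bond prices by backward induction from face value
    bond = [[1.0] * (bond_T + 1)]            # values at step bond_T
    for i in range(bond_T - 1, -1, -1):
        prev = bond[0]
        bond.insert(0, [
            math.exp(-rates[i][j] * dt) * 0.5 * (prev[j] + prev[j + 1])
            for j in range(i + 1)
        ])

    # Step 2: option values, also backwards from the option's expiry
    def payoff(P):
        return max(P - strike, 0.0) if is_call else max(strike - P, 0.0)

    opt = [payoff(bond[opt_T][j]) for j in range(opt_T + 1)]
    for i in range(opt_T - 1, -1, -1):
        opt = [math.exp(-rates[i][j] * dt) * 0.5 * (opt[j] + opt[j + 1])
               for j in range(i + 1)]
        if american:
            opt = [max(opt[j], payoff(bond[i][j])) for j in range(i + 1)]
    return bond[0][0], opt[0]
```

With a flat (deterministic) rate tree, the bond price collapses to the usual discount factor and an at-the-forward option is worthless, which gives a quick check of the recursion.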

The initial lattice is built by discretizing either a short-rate model, such as Hull–White or Black–Derman–Toy, or a forward-rate-based model, such as the LIBOR market model or HJM. As for equity, trinomial trees may also be employed for these models;[23] this is usually the case for Hull–White trees. For the forward-rate-based models, dependent on volatility assumptions, the lattice might not recombine.[24][25] This means that an "up-move" followed by a "down-move" will not give the same result as a "down-move" followed by an "up-move". In this case, the lattice is sometimes referred to as a bush, and the number of nodes grows exponentially as a function of the number of time-steps.

Under HJM,[26] the condition of no arbitrage implies that there exists a martingale probability measure, as well as a corresponding restriction on the "drift coefficients" of the forward rates. These, in turn, are functions of the volatility(ies) of the forward rates.[27] A "simple" discretized expression[28] for the drift then allows forward rates to be expressed in a binomial lattice. A recombining binomial tree methodology is also available for the LIBOR market model.[29]

As regards the short-rate models, these are, in turn, further categorized: they will be either equilibrium-based (Vasicek and CIR) or arbitrage-free (Ho–Lee and subsequent). This distinction means that for equilibrium-based models the yield curve is an output of the model, while for arbitrage-free models the yield curve is an input.[30] In the former case, the approach is to "calibrate" the model parameters such that bond prices produced by the model, in its continuous form, best fit observed market prices.[31] The tree is then built as a function of these parameters. In the latter case, the calibration is directly on the lattice: the fit is to both the current term structure of interest rates (i.e. the yield curve) and the corresponding volatility structure. Here, calibration means that the interest-rate tree reproduces the prices of the zero-coupon bonds — and any other interest-rate-sensitive securities — used in constructing the yield curve; note the parallel to implied trees above, and compare Bootstrapping (finance). For models assuming a normal distribution (such as Ho–Lee), calibration may be performed analytically, while for log-normal models the calibration is via a root-finding algorithm; see the boxed description under Black–Derman–Toy model.
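For a normal model the analytic calibration mentioned above can be illustrated with a minimal Ho–Lee sketch, in which each period's drift is solved in closed form by forward induction on Arrow–Debreu state prices; the discretization details and names below are illustrative assumptions, not the unique formulation.

```python
import math

def holee_tree(discounts, sigma, dt):
    """Calibrate a Ho-Lee binomial short-rate tree to the yield curve.

    discounts[i]: market price today of $1 paid at step i (discounts[0] = 1).
    Node rates are r[i][j] = m[i] + sigma*sqrt(dt)*(2j - i), with
    risk-neutral probability 1/2 per branch; because the model is normal,
    each drift m[i] is found analytically, with no root search.
    """
    n = len(discounts) - 1
    rates, Q = [], [1.0]                      # Q[j]: state price of node (i, j)
    for i in range(n):
        shift = [sigma * math.sqrt(dt) * (2 * j - i) for j in range(i + 1)]
        # Solve sum_j Q[j]*exp(-(m + shift[j])*dt) = discounts[i+1] for m
        s = sum(Q[j] * math.exp(-shift[j] * dt) for j in range(i + 1))
        m = math.log(s / discounts[i + 1]) / dt
        level = [m + shift[j] for j in range(i + 1)]
        rates.append(level)
        # Forward induction: roll the state prices one step forward
        Qn = [0.0] * (i + 2)
        for j in range(i + 1):
            df = 0.5 * Q[j] * math.exp(-level[j] * dt)
            Qn[j] += df        # down branch
            Qn[j + 1] += df    # up branch
        Q = Qn
    return rates
```

By construction the tree reprices every input zero-coupon bond, which is exactly the calibration condition described in the text.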

The volatility structure — i.e. vertical node-spacing — here reflects the volatility of rates during the quarter, or other period, corresponding to the lattice time-step. (Some analysts use "realized volatility", i.e. of the rates applicable historically for the time-step; others prefer to use current interest rate cap prices, and the implied volatility for the Black-76 prices of each component caplet; see Interest rate cap#Implied Volatilities.) Given this functional link to volatility, note the resultant difference in construction relative to implied trees: here, the volatility is known for each time-step, and the node-values must be solved for specified risk-neutral probabilities; for implied trees, on the other hand, a single volatility cannot be specified per time-step, i.e. we have a "smile", and the tree is built by solving for the probabilities corresponding to specified values of the underlying at each node.

Once calibrated, the interest rate lattice is then used in the valuation of a variety of fixed income instruments and derivatives.[32] The approach for bond options is described aside — note that this approach addresses the problem of pull to par experienced under closed-form approaches; see Black–Scholes model#Valuing bond options. For swaptions the logic is almost identical, substituting swaps for bonds in step 1, and swaptions for bond options in step 2. For caps (and floors) steps 1 and 2 are combined: at each node the value is based on the relevant nodes at the later step, plus, for any caplet (floorlet) maturing in the time-step, the difference between its reference rate and the short rate at the node (reflecting the corresponding day count fraction and notional exchanged). For callable and putable bonds a third step is required: at each node in the time-step, incorporate the effect of the embedded option on the bond price and/or the option price there, before stepping backwards one time-step. (Note that these options are not mutually exclusive, and so a bond may have several options embedded;[33] hybrid securities are treated below.) For other, more exotic interest rate derivatives, similar adjustments are made to steps 1 and onward.

An alternative approach to modeling (American) bond options, particularly those struck on yield to maturity (YTM), employs modified equity-lattice methods.[34] Here the analyst builds a CRR tree of YTM, applying a constant volatility assumption, and then calculates the bond price as a function of this yield at each node; prices here thus pull to par. The second step is to incorporate any term structure of volatility by building a corresponding DKC tree (based on every second time-step in the CRR tree, as DKC is trinomial whereas CRR is binomial) and then using this for option valuation.

Hybrid Securities[edit]

Hybrid securities, incorporating both equity- and bond-like features, are also valued using trees.[35][36]

For convertible bonds (CBs) the approach of Tsiveriotis and Fernandes (1998)[37] is to divide the value of the bond at each node into an "equity" component, arising from situations where the CB will be converted, and a "debt" component, arising from situations where the CB is redeemed. Correspondingly, twin trees are constructed, where discounting is at the risk-free and credit-risk-adjusted rate respectively, with the sum being the value of the CB.[38] An alternative approach, originally published by Goldman Sachs (1994),[39] does not decouple the components; rather, discounting is at a conversion-probability-weighted risk-free and risky interest rate within a single tree. See Convertible bond#Valuation, Contingent convertible bond.
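The twin-component recursion can be sketched as follows for the simplest case — a zero-coupon convertible with no call or put features, on a CRR equity tree. This is a simplified illustration of the Tsiveriotis–Fernandes split under those stated assumptions, not a full implementation; names are illustrative.

```python
import math

def tf_convertible(S0, face, kappa, r, spread, sigma, T, steps):
    """Tsiveriotis-Fernandes-style valuation of a zero-coupon convertible.

    The value at each node is split into an equity part (discounted at the
    risk-free rate r) and a cash-only part (discounted at r + credit spread).
    kappa: conversion ratio (shares received per bond).
    """
    dt = T / steps
    u = math.exp(sigma * math.sqrt(dt)); d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)
    df_e = math.exp(-r * dt)               # risk-free discount (equity part)
    df_d = math.exp(-(r + spread) * dt)    # risky discount (debt part)

    # Maturity: convert if conversion value exceeds redemption
    E, D = [], []
    for j in range(steps + 1):
        conv = kappa * S0 * u**j * d**(steps - j)
        if conv >= face:
            E.append(conv); D.append(0.0)
        else:
            E.append(0.0); D.append(face)

    for i in range(steps - 1, -1, -1):
        En, Dn = [], []
        for j in range(i + 1):
            e = df_e * (p * E[j + 1] + (1 - p) * E[j])
            dd = df_d * (p * D[j + 1] + (1 - p) * D[j])
            conv = kappa * S0 * u**j * d**(i - j)
            if conv > e + dd:              # voluntary conversion
                En.append(conv); Dn.append(0.0)
            else:
                En.append(e); Dn.append(dd)
        E, D = En, Dn
    return E[0] + D[0]
```

The result should never fall below either the immediate conversion value or the risky straight-bond floor, which provides a basic sanity check.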

More generally, equity can be viewed as a call option on the firm:[40] where the value of the firm is less than the value of the outstanding debt, shareholders would choose not to repay the firm's debt; otherwise they would choose to repay, and not to liquidate (i.e. exercise their option). Lattice models have been developed for equity analysis here,[41][42] particularly as relates to distressed firms.[43] Relatedly, as regards corporate debt pricing, the relationship between equity holders' limited liability and potential Chapter 11 proceedings has also been modelled via lattice.[44]


  1. ^ Cox, J. C., Ross, S. A., & Rubinstein, M. (1979). Option pricing: A simplified approach. Journal of financial Economics, 7(3), 229-263.
  2. ^ Hull, J. C. (2006). Options, futures, and other derivatives. Pearson Education India.