# Milstein method

In mathematics, the Milstein method is a technique for the approximate numerical solution of a stochastic differential equation. It is named after Grigori N. Milstein, who first published it in 1974.

## Description

Consider the autonomous Itō stochastic differential equation:

$\mathrm {d} X_{t}=a(X_{t})\,\mathrm {d} t+b(X_{t})\,\mathrm {d} W_{t}$

with initial condition $X_{0}=x_{0}$, where $W_{t}$ denotes the Wiener process, and suppose that we wish to solve this SDE on some interval of time $[0,T]$. Then the Milstein approximation to the true solution $X$ is the Markov chain $Y$ defined as follows:

• Partition the interval $[0,T]$ into $N$ equal subintervals of width $\Delta t>0$:
  $0=\tau _{0}<\tau _{1}<\dots <\tau _{N}=T$ with $\tau _{n}:=n\Delta t$ and $\Delta t={\frac {T}{N}}.$
• Set $Y_{0}=x_{0}.$
• Recursively define $Y_{n+1}$ for $0\leq n\leq N-1$ by:
  $Y_{n+1}=Y_{n}+a(Y_{n})\Delta t+b(Y_{n})\Delta W_{n}+{\frac {1}{2}}b(Y_{n})b'(Y_{n})\left((\Delta W_{n})^{2}-\Delta t\right),$
  where $b'$ denotes the derivative of $b(x)$ with respect to $x$ and
  $\Delta W_{n}=W_{\tau _{n+1}}-W_{\tau _{n}}$
  are independent and identically distributed normal random variables with expected value zero and variance $\Delta t$.

Then $Y_{n}$ approximates $X_{\tau _{n}}$ for $0\leq n\leq N$, and increasing $N$ yields a better approximation.

Note that when $b'(Y_{n})=0$ , i.e. the diffusion term does not depend on $X_{t}$ , this method is equivalent to the Euler–Maruyama method.
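The recursion above can be expressed as a single update function. The sketch below is a minimal Python illustration; the function name `milstein_step` and the example coefficient values are my own choices, not part of the scheme's standard notation.

```python
def milstein_step(y, a, b, db, dt, dW):
    """One Milstein update for dX = a(X) dt + b(X) dW.

    a and b are the drift and diffusion coefficients; db is b'(x).
    dW is a draw from a normal distribution with mean 0 and variance dt.
    """
    return y + a(y) * dt + b(y) * dW + 0.5 * b(y) * db(y) * (dW**2 - dt)

# Illustrative GBM coefficients a(x) = μx, b(x) = σx (so b'(x) = σ):
mu, sigma = 0.1, 0.2
y_next = milstein_step(1.0, lambda x: mu * x, lambda x: sigma * x,
                       lambda x: sigma, dt=0.01, dW=0.05)
```

If `db` is identically zero, the correction term vanishes and the update reduces to the Euler–Maruyama step, matching the remark above.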

The Milstein scheme has both weak and strong order of convergence $\Delta t$. This is superior to the Euler–Maruyama method, which has the same weak order of convergence $\Delta t$ but only strong order of convergence ${\sqrt {\Delta t}}$.
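These strong convergence rates can be observed empirically. The sketch below is my own construction: it simulates geometric Brownian motion, for which the exact endpoint $X_{T}=x_{0}\exp \left(\left(\mu -{\frac {1}{2}}\sigma ^{2}\right)T+\sigma W_{T}\right)$ is available as a reference, and measures the mean absolute endpoint error of each scheme. Halving $\Delta t$ roughly halves the Milstein error but shrinks the Euler–Maruyama error only by a factor of ${\sqrt {2}}$. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, T, x0 = 1.0, 0.5, 1.0, 1.0  # illustrative GBM parameters

def strong_error(n_steps, n_paths=2000):
    """Mean absolute endpoint error of each scheme vs the exact GBM solution."""
    dt = T / n_steps
    err_mil = err_em = 0.0
    for _ in range(n_paths):
        dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
        # Exact solution uses the full Brownian path through W_T = sum(dW)
        exact = x0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * dW.sum())
        y_mil = y_em = x0
        for dw in dW:
            y_em += mu * y_em * dt + sigma * y_em * dw
            y_mil += (mu * y_mil * dt + sigma * y_mil * dw
                      + 0.5 * sigma**2 * y_mil * (dw**2 - dt))
        err_mil += abs(y_mil - exact)
        err_em += abs(y_em - exact)
    return err_mil / n_paths, err_em / n_paths

err_milstein, err_euler = strong_error(32)
```

Comparing `strong_error` at successive step counts shows the Milstein error falling roughly linearly in $\Delta t$ while the Euler–Maruyama error falls like ${\sqrt {\Delta t}}$.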

## Intuitive derivation

For this derivation, we will only look at geometric Brownian motion (GBM), the stochastic differential equation of which is given by:

$\mathrm {d} X_{t}=\mu X_{t}\,\mathrm {d} t+\sigma X_{t}\,\mathrm {d} W_{t}$

with real constants $\mu$ and $\sigma$. Using Itō's lemma we get:
$\mathrm {d} \ln X_{t}=\left(\mu -{\frac {1}{2}}\sigma ^{2}\right)\mathrm {d} t+\sigma \mathrm {d} W_{t}$

Thus, the solution of the GBM SDE is:

${\begin{aligned}X_{t+\Delta t}&=X_{t}\exp \left\{\int _{t}^{t+\Delta t}\left(\mu -{\frac {1}{2}}\sigma ^{2}\right)\mathrm {d} u+\int _{t}^{t+\Delta t}\sigma \,\mathrm {d} W_{u}\right\}\\&\approx X_{t}\left(1+\mu \Delta t-{\frac {1}{2}}\sigma ^{2}\Delta t+\sigma \Delta W_{t}+{\frac {1}{2}}\sigma ^{2}(\Delta W_{t})^{2}\right)\\&=X_{t}+a(X_{t})\Delta t+b(X_{t})\Delta W_{t}+{\frac {1}{2}}b(X_{t})b'(X_{t})((\Delta W_{t})^{2}-\Delta t)\end{aligned}}$

where the approximation expands the exponential to second order and discards terms of higher order than $\Delta t$ (recalling that $(\Delta W_{t})^{2}$ is of order $\Delta t$), and where
$a(x)=\mu x,~b(x)=\sigma x.$

*Figure: numerical solution of the stochastic differential equation above for three different trajectories; the drift is twice the diffusion coefficient.*
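As a sanity check on the expansion above, one can compare the exact one-step growth factor of GBM with the Milstein and Euler–Maruyama factors for a small step. The constants and the increment below are illustrative choices of mine, not values from the text.

```python
import numpy as np

# Illustrative constants and one small increment (dW is of order sqrt(dt))
mu, sigma = 0.05, 0.3
dt, dW = 1e-3, 0.02

# Exact GBM one-step factor X_{t+dt} / X_t, from the closed-form solution
exact_factor = np.exp((mu - 0.5 * sigma**2) * dt + sigma * dW)

# Milstein factor: keeps the (dW^2 - dt) correction term
milstein_factor = 1 + mu * dt + sigma * dW + 0.5 * sigma**2 * (dW**2 - dt)

# Euler–Maruyama factor: drops the correction term
euler_factor = 1 + mu * dt + sigma * dW
```

The Milstein factor agrees with the exact factor to higher order in $\Delta t$ than the Euler–Maruyama factor does, which is the source of its better strong convergence.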

### Computer implementation

The following Python code implements the Milstein method and uses it to solve the SDE describing the Geometric Brownian Motion defined by

${\begin{cases}\mathrm {d} Y_{t}=\mu Y_{t}\,\mathrm {d} t+\sigma Y_{t}\,\mathrm {d} W_{t}\\Y_{0}=Y_{\text{init}}\end{cases}}$

```python
# -*- coding: utf-8 -*-
# Milstein method

import numpy as np
import matplotlib.pyplot as plt

num_sims = 1  # number of example trajectories

# One second of simulated time on 1000 grid intervals
t_init, t_end = 0, 1
N = 1000
dt = float(t_end - t_init) / N

# Initial condition and GBM parameters
y_init = 1
μ, σ = 3, 1

def dW(Δt):
    """Sample one Brownian increment: normal with mean 0 and variance Δt."""
    return np.random.normal(loc=0.0, scale=np.sqrt(Δt))

# Vectors to fill
ts = np.linspace(t_init, t_end, N + 1)  # avoids np.arange endpoint round-off
ys = np.zeros(N + 1)
ys[0] = y_init

for _ in range(num_sims):
    for i in range(1, ts.size):
        y = ys[i - 1]
        dw_ = dW(dt)
        # Milstein update with a(y) = μy, b(y) = σy, b'(y) = σ
        ys[i] = y + μ * y * dt + σ * y * dw_ + 0.5 * σ**2 * y * (dw_**2 - dt)
    plt.plot(ts, ys)

plt.xlabel("time (s)")
plt.ylabel("y", rotation=0)
plt.grid()
plt.show()
```