# Order of integration


In statistics, the order of integration, denoted I(d), of a time series is a summary statistic that reports the minimum number of differences required to obtain a covariance-stationary series.

## Integration of order zero

A time series is integrated of order 0 if it admits a moving average representation with

${\displaystyle \sum _{k=0}^{\infty }\mid {b_{k}}\mid ^{2}<\infty ,}$

where ${\displaystyle b}$ is the (possibly infinite) vector of moving-average weights (coefficients or parameters). This implies that the autocovariance decays to zero sufficiently quickly. This is a necessary, but not sufficient, condition for stationarity: hence all stationary processes are I(0), but not all I(0) processes are stationary.
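As an illustration of the square-summability condition (an example added here, not from the article): a stationary AR(1) process has a moving-average representation with weights ${\displaystyle b_{k}=\varphi ^{k}}$, which are square-summable whenever ${\displaystyle |\varphi |<1}$.

```python
import numpy as np

# Illustration (assumed example): an AR(1) process x_t = phi*x_{t-1} + e_t
# with |phi| < 1 has the MA(infinity) representation x_t = sum_k phi^k e_{t-k},
# so b_k = phi^k. The I(0) condition holds because
# sum_k |b_k|^2 = 1/(1 - phi^2) is finite.
phi = 0.7
b = phi ** np.arange(200)               # MA weights b_k = phi^k (truncated)
partial_sum = np.sum(np.abs(b) ** 2)    # truncated value of sum_k |b_k|^2
closed_form = 1.0 / (1.0 - phi ** 2)    # geometric-series closed form
print(partial_sum, closed_form)         # the two values agree closely
```

The truncation error is of order ${\displaystyle \varphi ^{400}}$, which is negligible here.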

## Integration of order d

A time series is integrated of order d if

${\displaystyle (1-L)^{d}X_{t}\ }$

is a stationary process, where ${\displaystyle L}$ is the lag operator and ${\displaystyle 1-L}$ is the first difference, i.e.

${\displaystyle (1-L)X_{t}=X_{t}-X_{t-1}=\Delta X_{t}.}$

In other words, a process is integrated of order d if applying the difference operator d times yields a stationary process.
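A minimal sketch of this definition, using the canonical example of a random walk (an assumed example, not from the article): the walk itself is not stationary, but one application of ${\displaystyle 1-L}$ recovers its white-noise increments, so the walk is I(1).

```python
import numpy as np

# Sketch: a random walk X_t = X_{t-1} + e_t is I(1).
# It is not stationary, but its first difference (1 - L)X_t = e_t
# is white noise, hence stationary.
rng = np.random.default_rng(0)
e = rng.standard_normal(1000)   # white-noise innovations (stationary, I(0))
X = np.cumsum(e)                # random walk, an I(1) series
dX = np.diff(X)                 # first difference, (1 - L)X_t
# Differencing exactly recovers the innovations e_1, ..., e_{T-1}:
print(np.allclose(dX, e[1:]))
```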

## Constructing an integrated series

An I(d) process can be constructed by summing an I(d − 1) process:

• Suppose ${\displaystyle X_{t}}$ is I(d − 1).
• Construct the series of partial sums ${\displaystyle Z_{t}=\sum _{k=0}^{t}X_{k}.}$
• Then Z is I(d), since its first differences are I(d − 1):
${\displaystyle \Delta Z_{t}=X_{t},}$
where
${\displaystyle X_{t}\sim I(d-1).}$
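The construction above can be sketched numerically (an assumed example with d = 1, taking white noise as the I(0) input): summing raises the order of integration by one, because first-differencing the partial sums returns the original series.

```python
import numpy as np

# Sketch of the construction: summing an I(d-1) series yields an I(d) series.
# Here d = 1, so the input X_t is taken to be I(0) white noise.
rng = np.random.default_rng(1)
X = rng.standard_normal(500)    # X_t ~ I(0)
Z = np.cumsum(X)                # Z_t = sum_{k=0}^{t} X_k, an I(1) series
# First differences of Z recover X: Delta Z_t = Z_t - Z_{t-1} = X_t for t >= 1.
print(np.allclose(np.diff(Z), X[1:]))
```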