# Markov property

In probability theory and statistics, the term Markov property refers to the memoryless property of a stochastic process. It is named after the Russian mathematician Andrey Markov.[1]

A stochastic process has the Markov property if the conditional probability distribution of future states of the process depends only upon the present state, not on the sequence of events that preceded it. A process with this property is called a Markov process. The term strong Markov property is similar to the Markov property, except that the meaning of "present" is defined in terms of a random variable known as a stopping time. Both the terms "Markov property" and "strong Markov property" have been used in connection with a particular "memoryless" property of the exponential distribution.[2]

The term Markov assumption is used to describe a model where the Markov property is assumed to hold, such as a hidden Markov model.

A Markov random field[3] extends this property to two or more dimensions, or to random variables defined on an interconnected network of items. An example of a model for such a field is the Ising model.

A discrete-time stochastic process with the Markov property is known as a Markov chain.

## Introduction

A stochastic process has the Markov property if the conditional probability distribution of future states of the process (conditional on both past and present values) depends only upon the present state; that is, given the present, the future does not depend on the past. A process with this property is said to be Markovian or a Markov process. The most famous Markov process is a Markov chain. Brownian motion is another well-known Markov process.

## Definition

Let $(\Omega,\mathcal{F},\mathbb{P})$ be a probability space with a filtration $(\mathcal{F}_s,\ s \in I)$, for some (totally ordered) index set $I$; and let $(S,\mathcal{S})$ be a measurable space. An $(S,\mathcal{S})$-valued stochastic process $X=(X_t,\ t\in I)$ adapted to the filtration is said to possess the Markov property if, for each $A \in \mathcal{S}$ and each $s,t\in I$ with $s<t$,

$\mathbb{P}(X_t \in A |\mathcal{F}_s) = \mathbb{P}(X_t \in A| X_s).$[4]

In the case where $S$ is a discrete set with the discrete sigma algebra and $I = \mathbb{N}$, this can be reformulated as follows:

$\mathbb{P}(X_n=x_n|X_{n-1}=x_{n-1}, \dots, X_0=x_0)=\mathbb{P}(X_n=x_n|X_{n-1}=x_{n-1})$.
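The discrete formulation can be checked empirically by simulation. The sketch below uses a hypothetical two-state chain with an illustrative transition matrix (the matrix values and trajectory length are assumptions, not from the source): conditioning on additional history beyond the most recent state should not change the estimated transition frequency.

```python
import random

# Illustrative two-state transition matrix (an assumption for this sketch):
# P[i][j] = P(X_n = j | X_{n-1} = i).
P = [[0.7, 0.3],
     [0.4, 0.6]]

def simulate(n_steps, seed=0):
    """Simulate one long trajectory of the chain, starting in state 0."""
    rng = random.Random(seed)
    states = [0]
    for _ in range(n_steps):
        current = states[-1]
        states.append(0 if rng.random() < P[current][0] else 1)
    return states

def cond_freq(states, target, given):
    """Empirical frequency of X_n == target, given that the preceding
    states equal the tuple `given` (most recent state last)."""
    k = len(given)
    hits = total = 0
    for i in range(k, len(states)):
        if tuple(states[i - k:i]) == given:
            total += 1
            hits += states[i] == target
    return hits / total

states = simulate(200_000)
# Conditioning on one extra step of history barely moves the estimate:
p_short = cond_freq(states, 1, (0,))    # estimates P(X_n=1 | X_{n-1}=0)
p_long = cond_freq(states, 1, (1, 0))   # adds X_{n-2}=1 to the conditioning
```

Both estimates should be close to the matrix entry `P[0][1] = 0.3`, reflecting that, given the present state, the extra past is uninformative.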

## Strong Markov property

Suppose that $X=(X_t:t\geq 0)$ is a stochastic process on a probability space $(\Omega,\mathcal{F},\mathbb{P})$ with natural filtration $\{\mathcal{F}_t\}_{t\geq 0}$. Then $X$ is said to have the strong Markov property if, for each stopping time $\tau$, conditioned on the event $\{\tau < \infty\}$, the process $X_{\tau + \cdot}$ is independent of $\mathcal{F}_{\tau}:=\{A \in \mathcal{F}: \{\tau \leq t\} \cap A \in \mathcal{F}_t \ \text{for all} \ t \geq 0\}$ and $X_{\tau + t}-X_{\tau}$ has the same distribution as $X_{t}$ for each $t \geq 0$.

The strong Markov property is a stronger property than the ordinary Markov property, since by taking the stopping time $\tau=t$, the ordinary Markov property can be deduced.
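The strong Markov property can be illustrated with a random walk restarted at a stopping time. In this sketch (the up-probability, hitting level, and sample sizes are illustrative assumptions), $\tau$ is the first hitting time of a fixed level, and the post-$\tau$ increments should be distributed like a fresh walk started at 0.

```python
import random

def post_tau_increments(n_paths, horizon, level=3, p_up=0.6, seed=1):
    """Random walk with up-step probability p_up (illustrative values).
    tau = first hitting time of `level` is a stopping time; by the strong
    Markov property, X_{tau+t} - X_tau is distributed like a fresh walk X_t."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_paths):
        x = 0
        while x != level:                        # run the walk until tau
            x += 1 if rng.random() < p_up else -1
        start = x                                # X_tau
        for _ in range(horizon):                 # continue past tau
            x += 1 if rng.random() < p_up else -1
        samples.append(x - start)                # X_{tau+horizon} - X_tau
    return samples

incr = post_tau_increments(5000, horizon=4)
mean_incr = sum(incr) / len(incr)
# A fresh 4-step walk has mean 4 * (2*0.6 - 1) = 0.8; mean_incr should match.
```

Note that a deterministic time $\tau = t$ is itself a stopping time, which is exactly how the ordinary Markov property follows from the strong one.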

## Alternative formulations

Alternatively, the Markov property can be formulated as follows.

$\mathbb{E}[f(X_t)|\mathcal{F}_s]=\mathbb{E}[f(X_t)|\sigma(X_s)]$

for all $t\geq s\geq 0$ and all bounded, measurable functions $f:\mathbb{R}^n\rightarrow \mathbb{R}$.

## Applications

An important application of the Markov property in a generalized form is in Markov chain Monte Carlo computations in the context of Bayesian statistics.
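As a minimal sketch of this idea (not a production MCMC implementation; the target density, proposal step size, and sample count are illustrative assumptions), the Metropolis algorithm below constructs a Markov chain whose next state depends only on the current one, and whose stationary distribution is the target, here a standard normal:

```python
import math
import random

def metropolis(n_samples, target_logpdf, step=1.0, seed=2):
    """Minimal Metropolis sampler. Each proposal depends only on the
    current state, so the sampled sequence is a Markov chain; its
    stationary distribution is the target density."""
    rng = random.Random(seed)
    x = 0.0
    out = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)          # symmetric random-walk proposal
        # Accept with probability min(1, pi(prop)/pi(x)), done in log space;
        # 1.0 - rng.random() lies in (0, 1], so the log is well defined.
        if math.log(1.0 - rng.random()) < target_logpdf(prop) - target_logpdf(x):
            x = prop
        out.append(x)
    return out

# Target: standard normal, via its log-density up to an additive constant.
samples = metropolis(50_000, lambda z: -0.5 * z * z)
mean = sum(samples) / len(samples)
```

The empirical mean and variance of `samples` should approach 0 and 1, the moments of the standard normal target.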