Markov chains on a measurable state space
Revision as of 12:40, 4 November 2015
A Markov chain on a measurable state space is a discrete-time Markov chain with a measurable space as state space.
History
The definition of Markov chains has evolved during the 20th century. In 1953 the term Markov chain was used for stochastic processes with a discrete or continuous index set, living on a countable or finite state space; see Doob[1] or Chung[2]. Since the late 20th century it has become more popular to consider a Markov chain as a stochastic process with discrete index set, living on a measurable state space[3][4][5].
Definition
Denote with <math>(E, \Sigma)</math> a measurable space and with <math>p</math> a Markov kernel with source and target <math>(E, \Sigma)</math>. A stochastic process <math>(X_n)_{n \in \mathbb{N}}</math> on <math>(\Omega, \mathcal{F}, \mathbb{P})</math> is called a time homogeneous Markov chain with Markov kernel <math>p</math> and start distribution <math>\mu</math> if

:<math>\mathbb{P}[X_0 \in A_0, X_1 \in A_1, \dots, X_n \in A_n] = \int_{A_0} \dots \int_{A_{n-1}} p(y_{n-1}, A_n) \, p(y_{n-2}, dy_{n-1}) \dots p(y_0, dy_1) \, \mu(dy_0)</math>

is satisfied for any <math>n \in \mathbb{N}</math> and <math>A_0, \dots, A_n \in \Sigma</math>. One can construct for any Markov kernel and any probability measure an associated Markov chain[4].
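The construction above can be illustrated by simulation. The sketch below (the matrix, start distribution, and function names are illustrative assumptions, not taken from the article) uses a finite state space, on which a Markov kernel reduces to a row-stochastic matrix, and draws one path of the associated chain:

```python
import random

# Finite state space E = {0, 1, 2}; on a finite space a Markov kernel
# reduces to a row-stochastic matrix: p(x, {y}) = P[x][y].
# (Illustrative values; not from the article.)
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
]
mu = [1.0, 0.0, 0.0]  # start distribution: X_0 = 0 almost surely

def sample_chain(P, mu, n, rng=random.Random(0)):
    """Draw one path (X_0, ..., X_n) of the chain with kernel P and start mu."""
    x = rng.choices(range(len(mu)), weights=mu)[0]  # sample X_0 ~ mu
    path = [x]
    for _ in range(n):
        x = rng.choices(range(len(P)), weights=P[x])[0]  # one kernel step
        path.append(x)
    return path

path = sample_chain(P, mu, 10)
print(path)
```

Each step depends only on the current state, which is exactly the time-homogeneous Markov property of the definition.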
Remark about Markov kernel integration
For any measure <math>\mu \colon \Sigma \to [0, \infty]</math> we denote for a <math>\mu</math>-integrable function <math>f</math> the Lebesgue integral as <math>\int_E f(x) \, \mu(dx)</math>. For the measure <math>\nu_x \colon \Sigma \to [0, \infty]</math> defined by <math>\nu_x(A) := p(x, A)</math> we used the following notation:

:<math>\int_E f(y) \, p(x, dy) := \int_E f(y) \, \nu_x(dy).</math>
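On a finite state space this integral against the kernel measure <math>\nu_x</math> is just a weighted sum over states, which a short sketch makes concrete (the matrix and the tabulated function are illustrative assumptions):

```python
# On a finite state space, integrating f against the measure
# nu_x(A) = p(x, A) is a weighted sum over states.
# (Illustrative values; not from the article.)
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
]
f = [1.0, 2.0, 4.0]  # a function f: E -> R, tabulated on E = {0, 1, 2}

def integrate_kernel(f, P, x):
    """Compute int_E f(y) p(x, dy) = sum_y f(y) * P[x][y]."""
    return sum(fy * py for fy, py in zip(f, P[x]))

print(integrate_kernel(f, P, 0))  # 0.5*1.0 + 0.5*2.0 + 0.0*4.0 = 1.5
```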
Basic properties
Starting in a single point
If <math>\mu</math> is a Dirac measure in <math>x</math>, we denote for a Markov kernel <math>p</math> with starting distribution <math>\mu</math> the associated Markov chain as <math>(X_n)_{n \in \mathbb{N}}</math> on <math>(\Omega, \mathcal{F}, \mathbb{P}_x)</math> and the expectation value

:<math>\mathbb{E}_x[X] = \int_\Omega X(\omega) \, \mathbb{P}_x(d\omega)</math>

for a <math>\mathbb{P}_x</math>-integrable function <math>X</math>. By definition, we have then <math>\mathbb{P}_x[X_0 = x] = 1</math>.

We have for any measurable function <math>f \colon E \to [0, \infty]</math> the following relation[4]:

:<math>\int_E f(y) \, p(x, dy) = \mathbb{E}_x[f(X_1)].</math>
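This relation between the kernel integral and the expectation of <math>f(X_1)</math> can be checked by Monte Carlo simulation on a finite state space. The sketch below (matrix, function values, and sample count are illustrative assumptions) compares the exact sum against an empirical average over single steps from a deterministic start:

```python
import random

# Monte Carlo check of  E_x[f(X_1)] = int_E f(y) p(x, dy)
# on a finite state space. (Illustrative values; not from the article.)
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
]
f = [1.0, 2.0, 4.0]
x = 1  # Dirac start distribution: the chain starts in state x

# Exact value: int_E f(y) p(x, dy) = sum_y f(y) * P[x][y]
exact = sum(f[y] * P[x][y] for y in range(3))

# Empirical value: average f(X_1) over many simulated single steps
rng = random.Random(42)
n_samples = 100_000
total = 0.0
for _ in range(n_samples):
    x1 = rng.choices(range(3), weights=P[x])[0]
    total += f[x1]
estimate = total / n_samples

print(exact, round(estimate, 2))  # the two values should be close
```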
Family of Markov kernels
For a Markov kernel <math>p</math> with starting distribution <math>\mu</math> one can introduce a family of Markov kernels <math>(p_n)_{n \in \mathbb{N}}</math> by

:<math>p_{n+1}(x,A) := \int_E p_n(y,A) \, p(x,dy)</math>

for <math>n \in \mathbb{N}, \, n \geq 1</math> and <math>p_1 := p</math>. For the associated Markov chain <math>(X_n)_{n \in \mathbb{N}}</math> according to <math>p</math> and <math>\mu</math> one obtains

:<math>\mathbb{P}[X_0 \in A , \, X_n \in B ] = \int_A p_n(x,B) \, \mu(dx)</math>.
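On a finite state space the recursion for <math>p_{n+1}</math> becomes matrix multiplication, so <math>p_n</math> is the <math>n</math>-th matrix power of the kernel, and the joint probability above becomes a finite sum. A minimal sketch, with an illustrative matrix and start distribution not taken from the article:

```python
# On a finite state space the n-step kernels p_n reduce to matrix
# powers: p_n(x, {y}) = (P^n)[x][y], since the recursion
# p_{n+1}(x, A) = int_E p_n(y, A) p(x, dy) is the product P @ P_n.
# (Illustrative values; not from the article.)
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.3, 0.7],
]
mu = [0.5, 0.5, 0.0]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step_kernel(P, n):
    """p_n as a matrix: p_1 = P, p_{n+1} = P @ p_n."""
    Pn = P
    for _ in range(n - 1):
        Pn = mat_mul(P, Pn)
    return Pn

def joint_prob(P, mu, A, B, n):
    """P[X_0 in A, X_n in B] = sum_{x in A} p_n(x, B) * mu({x})."""
    Pn = n_step_kernel(P, n)
    return sum(mu[x] * sum(Pn[x][y] for y in B) for x in A)

print(joint_prob(P, mu, A={0, 1}, B={2}, n=3))
```

Each <math>p_n</math> is again a Markov kernel: every row of <math>P^n</math> still sums to one.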
References
- ^ Joseph L. Doob: Stochastic Processes. New York: John Wiley & Sons, 1953.
- ^ Kai L. Chung: Markov Chains with Stationary Transition Probabilities. Second edition. Berlin: Springer-Verlag, 1974.
- ^ Sean Meyn and Richard L. Tweedie: Markov Chains and Stochastic Stability. Second edition, 2009.
- ^ Daniel Revuz: Markov Chains. Second edition, 1984.
- ^ Rick Durrett: Probability: Theory and Examples. Fourth edition, 2005.