# Markov blanket

In a Bayesian network, the Markov blanket of a node A consists of its parents, its children, and the other parents of all of its children.

In statistics and machine learning, the Markov blanket for a node in a graphical model contains all the variables that shield the node from the rest of the network. This means that the Markov blanket of a node is the only knowledge needed to predict the behavior of that node and its children. The term was coined by Judea Pearl in 1988.

In a Bayesian network, the values of a node's parents and children clearly carry information about that node. However, the other parents of its children must also be included: conditioning on a shared child makes its parents dependent, so those co-parents can be used to explain away the node in question. In a Markov random field, the Markov blanket of a node is simply its adjacent (or neighboring) nodes.

The Markov blanket for a node $A$ in a Bayesian network, denoted here by $\operatorname {MB} (A)$ , is the set of nodes composed of $A$ 's parents, $A$ 's children, and $A$ 's children's other parents.
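This definition can be sketched directly in code. Below is a minimal, illustrative Python function (the function name, the dict-of-parent-lists representation, and the example DAG are all invented for this example, not part of any standard library):

```python
# Sketch: compute the Markov blanket of a node in a DAG given as a
# dict mapping each node to the list of its parents (assumed representation).

def markov_blanket(node, parents):
    """Return MB(node): its parents, its children, and its
    children's other parents (co-parents)."""
    # Children are the nodes that list `node` among their parents.
    children = [n for n, ps in parents.items() if node in ps]
    blanket = set(parents.get(node, []))   # node's parents
    blanket.update(children)               # node's children
    for child in children:                 # children's other parents
        blanket.update(parents[child])
    blanket.discard(node)                  # the node itself is excluded
    return blanket

# Example DAG: C -> A, C -> B, A -> D, E -> D
dag = {"A": ["C"], "B": ["C"], "C": [], "D": ["A", "E"], "E": []}
print(markov_blanket("A", dag))  # the blanket is {'C', 'D', 'E'}
```

Here C is A's parent, D is A's child, and E is included only because it is another parent of D; B, a sibling through C, is outside the blanket.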

Every set of nodes in the network is conditionally independent of $A$ when conditioned on its Markov blanket $\operatorname {MB} (A)$: given the nodes in $\operatorname {MB} (A)$, $A$ is conditionally independent of every other node in the graph. Formally, this property can be written, for distinct nodes $A$ and $B$, as follows

$\Pr(A\mid \operatorname {MB} (A),B)=\Pr(A\mid \operatorname {MB} (A)).$
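This property can be checked numerically by brute-force enumeration on a small binary network. The network below (structure C -> A, C -> B, A -> D, E -> D, and all CPT values) is a hypothetical example constructed purely for illustration; here $\operatorname {MB} (A)=\{C,D,E\}$ and B lies outside the blanket.

```python
from itertools import product

# Hypothetical binary Bayesian network (illustrative values only):
#   C -> A, C -> B, A -> D, E -> D, so MB(A) = {C, D, E}.
p_c = 0.3                                  # P(C=1)
p_e = 0.6                                  # P(E=1)
p_b = {0: 0.2, 1: 0.7}                     # P(B=1 | C)
p_a = {0: 0.4, 1: 0.9}                     # P(A=1 | C)
p_d = {(0, 0): 0.1, (0, 1): 0.5,
       (1, 0): 0.8, (1, 1): 0.3}           # P(D=1 | A, E)

def joint(c, a, d, e, b):
    """Joint probability, factorized along the DAG."""
    pr = p_c if c else 1 - p_c
    pr *= p_e if e else 1 - p_e
    pr *= p_b[c] if b else 1 - p_b[c]
    pr *= p_a[c] if a else 1 - p_a[c]
    pr *= p_d[(a, e)] if d else 1 - p_d[(a, e)]
    return pr

def cond_a(evidence):
    """P(A=1 | evidence), computed by summing the joint."""
    num = den = 0.0
    for c, a, d, e, b in product((0, 1), repeat=5):
        assign = {"C": c, "A": a, "D": d, "E": e, "B": b}
        if all(assign[k] == v for k, v in evidence.items()):
            pr = joint(c, a, d, e, b)
            den += pr
            if a == 1:
                num += pr
    return num / den

# Conditioning additionally on B never changes P(A | MB(A)).
for c, d, e in product((0, 1), repeat=3):
    base = cond_a({"C": c, "D": d, "E": e})
    for b in (0, 1):
        assert abs(base - cond_a({"C": c, "D": d, "E": e, "B": b})) < 1e-12
print("P(A | MB(A), B) == P(A | MB(A)) holds for all assignments")
```

Enumeration is only feasible for tiny networks, but it makes the blanket property concrete: once C, D, and E are fixed, the value of B carries no further information about A.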