Bernoulli distribution

From Wikipedia, the free encyclopedia
Bernoulli
Parameters  0 \le p \le 1
Support     k \in \{0,1\}
PMF         \begin{cases} q=1-p & \text{for }k=0 \\ p & \text{for }k=1 \end{cases}
CDF         \begin{cases} 0 & \text{for }k<0 \\ q & \text{for }0\le k<1 \\ 1 & \text{for }k\ge 1 \end{cases}
Mean        p
Median      \begin{cases} 0 & \text{if }q>p \\ 0.5 & \text{if }q=p \\ 1 & \text{if }q<p \end{cases}
Mode        \begin{cases} 0 & \text{if }q>p \\ 0,1 & \text{if }q=p \\ 1 & \text{if }q<p \end{cases}
Variance    p(1-p)
Skewness    \frac{q-p}{\sqrt{pq}}
Ex. kurtosis \frac{1-6pq}{pq}
Entropy     -q\ln(q)-p\ln(p)
MGF         q+pe^t
CF          q+pe^{it}
PGF         q+pz
Fisher information \frac{1}{p(1-p)}

In probability theory and statistics, the Bernoulli distribution, named after the Swiss scientist Jacob Bernoulli, is the probability distribution of a random variable that takes the value 1 with success probability p and the value 0 with failure probability q = 1 - p. It can be used, for example, to represent the toss of a coin, where "1" means "heads" and "0" means "tails" (or vice versa).

Properties

If X is a random variable with this distribution, we have:

 \Pr(X=1) = 1 - \Pr(X=0) = 1 - q = p.

A classical example of a Bernoulli experiment is a single toss of a coin. The coin might come up heads with probability p and tails with probability 1-p. The experiment is called fair if p=0.5, indicating the origin of the terminology in betting (the bet is fair if both possible outcomes have the same probability).

The probability mass function f of this distribution is

 f(k;p) = \begin{cases} p & \text{if }k=1, \\[6pt]
1-p & \text{if }k=0. \end{cases}

This can also be expressed as

f(k;p) = p^k (1-p)^{1-k} \quad \text{for }k\in\{0,1\}.

The expected value of a Bernoulli random variable X is E\left(X\right)=p, and its variance is

\textrm{Var}\left(X\right)=p\left(1-p\right).
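These formulas can be checked directly from the pmf by summing over the support. A minimal Python sketch; the helper name `bernoulli_pmf` and the value p = 0.3 are illustrative choices, not part of the article:

```python
# Illustrative helper for the pmf f(k; p) = p^k (1-p)^(1-k);
# bernoulli_pmf and p = 0.3 are arbitrary choices.
def bernoulli_pmf(k, p):
    return p**k * (1 - p) ** (1 - k)

p = 0.3
mean = sum(k * bernoulli_pmf(k, p) for k in (0, 1))               # E(X) = p
var = sum((k - mean) ** 2 * bernoulli_pmf(k, p) for k in (0, 1))  # Var(X) = p(1-p)
```

Here the mean comes out to 0.3 and the variance to 0.3 × 0.7 = 0.21, matching E(X) = p and Var(X) = p(1 − p).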

The Bernoulli distribution is a special case of the binomial distribution with n = 1.[1]

The excess kurtosis tends to infinity as p approaches 0 or 1, but for p = 1/2 the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely −2.
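The excess-kurtosis formula from the table above can be evaluated directly (a minimal sketch; the function name is an arbitrary choice):

```python
# Excess kurtosis (1 - 6pq)/(pq) of a Bernoulli(p) variable, as given
# in the table above; the function name is an arbitrary choice.
def excess_kurtosis(p):
    q = 1 - p
    return (1 - 6 * p * q) / (p * q)
```

At p = 1/2 this gives (1 − 3/2)/(1/4) = −2, and it grows without bound as p approaches 0 or 1.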

The Bernoulli distributions for 0 \le p \le 1 form an exponential family.

The maximum likelihood estimator of p based on a random sample is the sample mean.
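A simulation sketch of this estimator, assuming nothing beyond the standard library; `p_true`, the seed, and the sample size are arbitrary illustrative choices:

```python
import random

# Draw i.i.d. Bernoulli(p) samples and estimate p by the sample mean,
# which is the maximum likelihood estimator. p_true, the seed, and the
# sample size are arbitrary illustrative choices.
random.seed(0)
p_true = 0.3
n = 100_000
sample = [1 if random.random() < p_true else 0 for _ in range(n)]
p_hat = sum(sample) / n  # sample mean = MLE of p
```

With a sample of this size, `p_hat` lands close to `p_true` (the standard error is sqrt(p(1 − p)/n) ≈ 0.0014 here).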

Related distributions

  • If X_1,\dots,X_n are independent, identically distributed (i.i.d.) random variables, all Bernoulli distributed with success probability p, then
Y = \sum_{k=1}^n X_k \sim \mathrm{B}(n,p) (binomial distribution).

The Bernoulli distribution is simply \mathrm{B}(1,p).
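This relationship can be verified exactly for small n by enumerating all outcomes (a minimal sketch; n = 4 and p = 0.3 are arbitrary choices):

```python
import itertools
import math

# Exact finite check that a sum of n i.i.d. Bernoulli(p) variables has
# the Binomial(n, p) distribution: enumerate all 2^n outcomes and
# accumulate the probability of each possible sum. n = 4 and p = 0.3
# are arbitrary choices.
n, p = 4, 0.3
dist = {}
for bits in itertools.product((0, 1), repeat=n):
    prob = math.prod(p if b else 1 - p for b in bits)
    s = sum(bits)
    dist[s] = dist.get(s, 0.0) + prob

for k in range(n + 1):
    binom_pmf = math.comb(n, k) * p**k * (1 - p) ** (n - k)
    assert abs(dist[k] - binom_pmf) < 1e-12  # agrees term by term
```

Each of the C(n, k) outcomes with exactly k ones has probability p^k (1 − p)^(n − k), which is where the binomial pmf comes from.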

Notes

  1. ^ McCullagh and Nelder (1989), Section 4.2.2.
