Bernoulli distribution

From Wikipedia, the free encyclopedia

In probability theory and statistics, the Bernoulli distribution, named after Swiss mathematician Jacob Bernoulli,[1] is the probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability q = 1 − p — i.e., the probability distribution of any single experiment that asks a yes–no question; the question results in a boolean-valued outcome, a single bit of information whose value is success/yes/true/one with probability p and failure/no/false/zero with probability q. It can be used to represent a coin toss where 1 and 0 would represent "head" and "tail" (or vice versa), respectively. In particular, unfair coins would have p ≠ 1/2.
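
As a quick illustration (not from the article), here is a minimal Python sketch that simulates such a biased coin; the success probability p = 0.3 is an arbitrary choice for the example.

```python
import random

p = 0.3            # arbitrary success probability for this example (an unfair coin)
n_tosses = 100_000

# Each toss is 1 ("head"/success) with probability p and 0 ("tail"/failure) otherwise.
tosses = [1 if random.random() < p else 0 for _ in range(n_tosses)]

print(sum(tosses) / n_tosses)   # fraction of ones; close to p = 0.3 for many tosses
```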

The Bernoulli distribution is a special case of the binomial distribution where a single experiment/trial is conducted (n = 1). It is also a special case of the two-point distribution, for which the outcome need not be a bit, i.e., the two possible outcomes need not be 0 and 1.

Properties of the Bernoulli distribution

If X is a random variable with this distribution, we have:

Pr(X = 1) = p = 1 − Pr(X = 0) = 1 − q.

The probability mass function f of this distribution, over possible outcomes k, is

f(k; p) = p if k = 1, and f(k; p) = q = 1 − p if k = 0.

This can also be expressed as

f(k; p) = p^k (1 − p)^(1 − k) for k ∈ {0, 1},

or as

f(k; p) = pk + (1 − p)(1 − k) for k ∈ {0, 1}.

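The three expressions above agree on the support {0, 1}; a small Python sketch (illustrative only) that evaluates all of them:

```python
def pmf_piecewise(k, p):
    """Piecewise form: p for k = 1, and 1 - p for k = 0."""
    return p if k == 1 else 1 - p

def pmf_power(k, p):
    """Closed form p**k * (1 - p)**(1 - k) on k in {0, 1}."""
    return p ** k * (1 - p) ** (1 - k)

def pmf_linear(k, p):
    """Linear form p*k + (1 - p)*(1 - k) on k in {0, 1}."""
    return p * k + (1 - p) * (1 - k)

p = 0.25  # example value, chosen so the comparisons below are exact in floating point
for k in (0, 1):
    assert pmf_piecewise(k, p) == pmf_power(k, p) == pmf_linear(k, p)
    print(k, pmf_piecewise(k, p))  # prints 0 0.75 and 1 0.25
```
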
The Bernoulli distribution is a special case of the binomial distribution with n = 1.[2]

The kurtosis goes to infinity for high and low values of p, but for p = 1/2 the two-point distributions, including the Bernoulli distribution, have a lower excess kurtosis than any other probability distribution, namely −2.
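
As a numerical illustration (not part of the article), the excess kurtosis can be computed by enumerating the two values of the standardized variable; at p = 1/2 it equals −2, and it grows without bound as p approaches 0 or 1:

```python
from math import sqrt

def excess_kurtosis(p):
    """Excess kurtosis of Bernoulli(p), by enumerating the two values of
    the standardized variable (X - p) / sqrt(p * q)."""
    q = 1 - p
    s = sqrt(p * q)                          # standard deviation
    fourth_moment = p * (q / s) ** 4 + q * (-p / s) ** 4
    return fourth_moment - 3                 # subtract 3 for *excess* kurtosis

print(excess_kurtosis(0.5))     # -2.0, the lowest excess kurtosis possible
print(excess_kurtosis(0.999))   # large and positive: it diverges as p -> 0 or 1
```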

The Bernoulli distributions for 0 ≤ p ≤ 1 form an exponential family.
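
One way to see this (a standard rewriting, not spelled out above) is to put the mass function into natural-parameter form:

f(k; p) = exp( k · ln(p/(1 − p)) + ln(1 − p) ),  k ∈ {0, 1},

which has the exponential-family shape exp(η·T(k) − A(η)) with natural parameter η = ln(p/(1 − p)) (the log-odds), sufficient statistic T(k) = k, and log-partition function A(η) = ln(1 + e^η).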

The maximum likelihood estimator of p based on a random sample is the sample mean.
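
A minimal simulation sketch (illustrative, with an arbitrarily chosen true p) showing that the sample mean recovers p:

```python
import random

true_p = 0.42                       # arbitrary "unknown" parameter for this example
sample = [1 if random.random() < true_p else 0 for _ in range(50_000)]

# The maximum likelihood estimate of p is simply the sample mean (fraction of ones).
p_hat = sum(sample) / len(sample)
print(p_hat)                        # close to 0.42 for a sample this large
```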


The expected value of a Bernoulli random variable X is

E[X] = p.

This is due to the fact that for a Bernoulli distributed random variable X with Pr(X = 1) = p and Pr(X = 0) = q we find

E[X] = Pr(X = 1) · 1 + Pr(X = 0) · 0 = p · 1 + q · 0 = p.

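The same computation, written as a generic "expected value from a probability mass function" loop (a sketch; p = 0.3 is an arbitrary example value):

```python
p = 0.3                                      # arbitrary example value
pmf = {0: 1 - p, 1: p}                       # Bernoulli probability mass function

expected_value = sum(k * prob for k, prob in pmf.items())
print(expected_value)                        # 0.3, i.e. E[X] = p
```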

The variance of a Bernoulli distributed X is

Var[X] = pq = p(1 − p).

We first find

E[X²] = Pr(X = 1) · 1² + Pr(X = 0) · 0² = p · 1 + q · 0 = p = E[X].

From this follows

Var[X] = E[X²] − E[X]² = p − p² = p(1 − p) = pq.

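The same steps in code (a sketch; p = 0.3 arbitrary): compute E[X²], then subtract E[X]²:

```python
p = 0.3                                                # arbitrary example value
q = 1 - p
pmf = {0: q, 1: p}

e_x = sum(k * prob for k, prob in pmf.items())         # E[X]   = p
e_x2 = sum(k ** 2 * prob for k, prob in pmf.items())   # E[X^2] = p as well
print(e_x2 - e_x ** 2, p * q)                          # both ≈ 0.21 = p(1 - p)
```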

The skewness is (q − p)/√(pq) = (1 − 2p)/√(p(1 − p)). When we take the standardized Bernoulli distributed random variable (X − E[X])/√(Var[X]), we find that this random variable attains q/√(pq) with probability p and attains −p/√(pq) with probability q. Thus we get

γ₁ = E[((X − E[X])/√(Var[X]))³]
   = p · (q/√(pq))³ + q · (−p/√(pq))³
   = (pq³ − qp³)/(pq)^(3/2)
   = pq(q − p)/(pq)^(3/2)
   = (q − p)/√(pq).

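A direct numerical check of this derivation (illustrative; p = 0.3 arbitrary):

```python
from math import sqrt

p = 0.3                                      # arbitrary example value
q = 1 - p
s = sqrt(p * q)                              # standard deviation of Bernoulli(p)

# Third moment of the standardized variable, enumerated over its two values.
skewness = p * (q / s) ** 3 + q * (-p / s) ** 3
print(skewness, (q - p) / s)                 # both ≈ 0.8729 for p = 0.3
```
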
Related distributions

  • If X₁, …, Xₙ are independent, identically distributed (i.i.d.) random variables, all Bernoulli distributed with success probability p, then
    X₁ + X₂ + ⋯ + Xₙ ~ B(n, p) (binomial distribution).

The Bernoulli distribution is simply B(1, p).
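
A small simulation sketch (not from the article) comparing the distribution of a sum of i.i.d. Bernoulli draws with the binomial PMF; n = 5 and p = 0.3 are arbitrary choices:

```python
import random
from collections import Counter
from math import comb

n, p, trials = 5, 0.3, 200_000               # arbitrary example settings

# Draw the sum of n Bernoulli(p) variables, many times over.
sums = Counter(
    sum(1 if random.random() < p else 0 for _ in range(n))
    for _ in range(trials)
)

for k in range(n + 1):
    empirical = sums[k] / trials
    binomial = comb(n, k) * p ** k * (1 - p) ** (n - k)   # B(n, p) mass at k
    print(k, round(empirical, 3), round(binomial, 3))     # the columns agree closely
```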

See also


References

  1. ^ James Victor Uspensky: Introduction to Mathematical Probability, McGraw-Hill, New York 1937, page 45
  2. ^ McCullagh and Nelder (1989), Section 4.2.2.


External links