# Bernoulli distribution

| Property | Value |
|---|---|
| Parameters | $0 \leq p \leq 1$ |
| Support | $k \in \{0,1\}$ |
| PMF | $\begin{cases} q=(1-p) & \text{for }k=0 \\ p & \text{for }k=1 \end{cases}$ |
| CDF | $\begin{cases} 0 & \text{for }k<0 \\ q & \text{for }0\leq k<1 \\ 1 & \text{for }k\geq 1 \end{cases}$ |
| Mean | $p$ |
| Median | $\begin{cases} 0 & \text{if } q > p \\ 0.5 & \text{if } q = p \\ 1 & \text{if } q < p \end{cases}$ |
| Mode | $\begin{cases} 0 & \text{if } q > p \\ 0,\,1 & \text{if } q = p \\ 1 & \text{if } q < p \end{cases}$ |
| Variance | $p(1-p)$ |
| Skewness | $\frac{q-p}{\sqrt{pq}}$ |
| Excess kurtosis | $\frac{1-6pq}{pq}$ |
| Entropy | $-q\ln(q)-p\ln(p)$ |
| MGF | $q+pe^t$ |
| CF | $q+pe^{it}$ |
| PGF | $q+pz$ |
| Fisher information | $\frac{1}{p(1-p)}$ |

In probability theory and statistics, the Bernoulli distribution, named after Swiss scientist Jacob Bernoulli, is the probability distribution of a random variable which takes value 1 with success probability $p$ and value 0 with failure probability $q=1-p$. It can be used, for example, to represent the toss of a coin, where "1" is defined to mean "heads" and "0" is defined to mean "tails" (or vice versa).

## Properties

If $X$ is a random variable with this distribution, we have:

$\Pr(X=1) = 1 - \Pr(X=0) = 1 - q = p.\!$

A classical example of a Bernoulli experiment is a single toss of a coin. The coin might come up heads with probability $p$ and tails with probability $1-p$. The experiment is called fair if $p=0.5$, indicating the origin of the terminology in betting (the bet is fair if both possible outcomes have the same probability).

The probability mass function $f$ of this distribution is

$f(k;p) = \begin{cases} p & \text{if }k=1, \\[6pt] 1-p & \text {if }k=0.\end{cases}$

This can also be expressed as

$f(k;p) = p^k (1-p)^{1-k}\!\quad \text{for }k\in\{0,1\}.$
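The closed-form expression above can be checked numerically; here is a minimal sketch (the function name `bernoulli_pmf` is my own, not from the source):

```python
# Bernoulli PMF: f(k; p) = p^k * (1-p)^(1-k) for k in {0, 1}.
def bernoulli_pmf(k, p):
    if k not in (0, 1):
        raise ValueError("k must be 0 or 1")
    return p ** k * (1 - p) ** (1 - k)

print(bernoulli_pmf(1, 0.3))  # 0.3 (probability of success)
print(bernoulli_pmf(0, 0.3))  # 0.7 (probability of failure)
```

Note that the exponents simply select between the two branches of the case definition: $k=1$ leaves $p$, and $k=0$ leaves $1-p$.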

The expected value of a Bernoulli random variable $X$ is $E\left(X\right)=p$, and its variance is

$\textrm{Var}\left(X\right)=p\left(1-p\right).\,$
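Both moments follow directly from the definition of expectation over the two-point support; a short sketch (the helper name `bernoulli_mean_var` is illustrative, not from the source):

```python
def bernoulli_mean_var(p):
    # E[X] = sum over k of k * f(k; p) = 0*(1-p) + 1*p = p
    mean = sum(k * p ** k * (1 - p) ** (1 - k) for k in (0, 1))
    # Since X takes only values 0 and 1, X^2 = X, so E[X^2] = E[X]
    # and Var(X) = E[X^2] - E[X]^2 = p - p^2 = p(1-p).
    var = mean - mean ** 2
    return mean, var

print(bernoulli_mean_var(0.25))  # (0.25, 0.1875)
```

The identity $X^2 = X$ on $\{0,1\}$ is what makes the variance collapse to $p(1-p)$.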

The Bernoulli distribution is a special case of the binomial distribution with $n = 1$.[1]

The kurtosis goes to infinity as $p$ approaches 0 or 1, but for $p=1/2$ the Bernoulli distribution has a lower excess kurtosis than any other probability distribution, namely −2.

The Bernoulli distributions for $0 \le p \le 1$ form an exponential family.

The maximum likelihood estimator of $p$ based on a random sample is the sample mean.
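A simulated illustration of this estimator (seed and sample size are arbitrary choices, not from the source):

```python
import random

random.seed(0)
p_true = 0.3  # true success probability (assumed for the simulation)

# Draw an i.i.d. Bernoulli(p_true) sample.
sample = [1 if random.random() < p_true else 0 for _ in range(10_000)]

# The MLE of p is simply the sample mean (fraction of successes).
p_hat = sum(sample) / len(sample)
print(p_hat)  # close to 0.3
```

Maximizing the log-likelihood $\sum_i [x_i \ln p + (1-x_i)\ln(1-p)]$ over $p$ gives exactly this fraction of successes.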

## Related distributions

• If $X_1,\dots,X_n$ are independent, identically distributed (i.i.d.) random variables, all Bernoulli distributed with success probability p, then
$Y = \sum_{k=1}^n X_k \sim \mathrm{B}(n,p)$ (binomial distribution).

The Bernoulli distribution is simply $\mathrm{B}(1,p)$.
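The binomial relationship above can be verified by simulation; a minimal sketch (sample sizes and seed are my own choices):

```python
import math
import random

random.seed(1)
n, p, trials = 10, 0.4, 20_000

# Each trial: Y = sum of n independent Bernoulli(p) draws.
counts = [sum(1 if random.random() < p else 0 for _ in range(n))
          for _ in range(trials)]

# Compare the empirical frequency of Y = k against the
# binomial PMF C(n, k) * p^k * (1-p)^(n-k).
k = 4
empirical = counts.count(k) / trials
theoretical = math.comb(n, k) * p ** k * (1 - p) ** (n - k)
print(empirical, theoretical)  # the two values should be close
```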

## Notes

1. McCullagh and Nelder (1989), Section 4.2.2.

## References

• Johnson, N.L., Kotz, S., Kemp, A. (1993) *Univariate Discrete Distributions* (2nd Edition). Wiley. ISBN 0-471-54897-9
