(pmf) \(\Pr(X = 1) = p = 1 - q, \qquad \Pr(X = 0) = q\)
(expected value)
\[E(X) = 0 \cdot \Pr(X = 0) + 1 \cdot \Pr(X = 1) = p\]
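As a quick numerical check (a minimal sketch with numpy; $p = 0.3$ and the seed are arbitrary choices), the sample mean of Bernoulli draws should approach $p$:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3  # arbitrary choice for illustration

# Bernoulli(p) samples are 0/1, so the sample mean estimates Pr(X = 1) = p.
samples = rng.binomial(n=1, p=p, size=100_000)
print(samples.mean())  # ~0.3
```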
$X \sim Binom(n,p)$
(pmf)
\[P(k ; n, p) = P(X = k) = \binom{n}{k} p^k q^{n-k}\]
(expected value)
\[E(X) = \sum_{k=0}^{n}{k \binom{n}{k} p^{k}q^{n-k}} = \cdots = np\]
This can be derived more easily using the linearity of expected value:
\[X_i \sim Bern(p)\\ E(X) = E(X_1 + \cdots + X_n) = E(X_1) + \cdots + E(X_n) = np\]
where the $X_i$ are indicator random variables for success on the $i$-th trial.
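A numerical check of $E(X) = np$ (a sketch using scipy.stats; $n = 10$, $p = 0.4$ are arbitrary parameters):

```python
import numpy as np
from scipy.stats import binom

n, p = 10, 0.4  # arbitrary parameters for illustration

# Built-in mean of Binom(n, p) should equal n*p = 4.0.
print(binom.mean(n, p))

# Equivalently, compute sum over k of k * pmf(k) directly.
k = np.arange(n + 1)
print(np.sum(k * binom.pmf(k, n, p)))  # also 4.0
```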
$X \sim Multinom(n, p_1, p_2, \dots, p_k)$
For example, the counts of each face when rolling a $k$-sided die $n$ times.
(pmf) \(\Pr(X_{1}=x_{1}{\text{ and }}\dots {\text{ and }}X_{k}=x_{k}) = \frac{n!}{x_1!\cdots x_k!} p_1^{x_1} \cdots p_k^{x_k}\)
(expected value)
\[\operatorname{E}(X_i) = n p_i\]
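A simulation sketch of $E(X_i) = n p_i$ for the die example (a fair six-sided die rolled $n = 60$ times; the parameters and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 60, [1 / 6] * 6  # fair six-sided die, 60 rolls

# Each row is one experiment: the counts (X_1, ..., X_6) over n rolls.
counts = rng.multinomial(n, p, size=100_000)

# The column means estimate E(X_i) = n * p_i = 10 for every face.
print(counts.mean(axis=0))  # ~[10, 10, 10, 10, 10, 10]
```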
$X \sim Geom(p)$
$X$ counts the number of failures before the first success, where each independent trial succeeds with probability $p$ (and fails with probability $q = 1 - p$).
(pmf)
\[\Pr(X = k) = q^k p, \qquad k = 0, 1, 2, \dots\]
(validation of pmf)
\[\begin{aligned} \sum_{k=0}^{\infty}{q^kp} & = p \sum_{k=0}^{\infty}{q^k}\\ & = \frac{p}{1-q}\\ & = 1 \end{aligned}\]
(expected value)
\[\begin{aligned} E(X) & = \sum_{k=0}^{\infty}{kq^kp}\\ & = p \sum_{k=1}^{\infty}{kq^k} \qquad \text{(1)} \end{aligned}\]
We can use the derivative of the geometric series to evaluate this sum.
\[\begin{aligned} \sum_{k=0}^{\infty}{q^k} &= \frac{1}{1-q}\\ \sum_{k=1}^{\infty}{kq^{k-1}} &= \frac{1}{(1-q)^2}\\ \sum_{k=1}^{\infty}{kq^k} &= \frac{q}{(1-q)^2} = \frac{q}{p^2}\\ \end{aligned}\]
(Differentiating the first identity with respect to $q$ gives the second; multiplying by $q$ and substituting $1 - q = p$ gives the third.) Plugging this back into (1),
\[E(X) = \frac{pq}{p^2} = \frac{q}{p}\]
(story proof)
Condition on the first trial: with probability $p$ it succeeds and $X = 0$; with probability $q$ it fails and the process restarts, so $X = 1 + X'$ with $X' \sim Geom(p)$. Hence $E(X) = p \cdot 0 + q\,(1 + E(X))$, which gives $E(X) = q/p$.
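A numerical check (a sketch with scipy.stats; note scipy's `geom` counts trials with support $\{1, 2, \dots\}$, so `loc=-1` shifts it to the failure-counting convention above; $p = 0.25$ is arbitrary):

```python
from scipy.stats import geom

p = 0.25  # arbitrary
q = 1 - p

# scipy's geom counts trials (support 1, 2, ...); loc=-1 shifts the support
# to 0, 1, 2, ... so it matches the failure-counting pmf q^k * p above.
print(geom.mean(p, loc=-1))                # q/p = 3.0
print(geom.pmf(2, p, loc=-1), q**2 * p)    # both 0.140625
```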
$X \sim HGeom(N, K, n)$
$X$ counts the successes when drawing $n$ items without replacement from a population of $N$ items, $K$ of which are successes.
(pmf) \(p_{X}(k)=\Pr(X=k)={\frac { {\binom {K}{k}}{\binom {N-K}{n-k}} }{\binom {N}{n}}}\)
(expected value) \(E(X) = n{K \over N}\)
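A numerical check of $E(X) = nK/N$ (a sketch with scipy.stats; the parameters are arbitrary, and note scipy's argument order differs from the notation here):

```python
from scipy.stats import hypergeom

N, K, n = 50, 20, 10  # population size, successes in population, draws

# scipy's argument order is (M, n, N) = (population, #successes, #draws),
# which maps to our (N, K, n).
print(hypergeom.mean(N, K, n))  # n*K/N = 4.0
```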
$X \sim \operatorname {NB} (r,p)$
The negative binomial distribution can be considered a generalization of the geometric distribution: it counts the number of successes before the $r$-th failure occurs. (In spite of the name, it is not obviously related to the binomial distribution.)
(pmf) \({\displaystyle f(k;r,p)\equiv \Pr(X=k)={\binom {k+r-1}{k}}p^{k}(1-p)^{r}}\)
(expected value) \(E(X) = {\frac {pr}{1-p}}\)
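A numerical check (a sketch with scipy.stats; $r = 5$, $p = 0.4$ are arbitrary, and scipy's `nbinom` uses the opposite success/failure convention, so the roles must be swapped):

```python
from math import comb

from scipy.stats import nbinom

r, p = 5, 0.4  # arbitrary parameters

# scipy's nbinom(n, p_s) counts failures before the n-th success with
# success probability p_s; passing p_s = 1 - p swaps the roles so it
# matches the pmf above (successes before the r-th failure).
print(nbinom.mean(r, 1 - p))       # p*r/(1-p) = 10/3
print(nbinom.pmf(3, r, 1 - p))     # C(7,3) * 0.4^3 * 0.6^5
print(comb(3 + r - 1, 3) * p**3 * (1 - p) ** r)  # same value
```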
$X \sim Pois(\lambda)$
(pmf) \({\displaystyle \Pr(X=k)={\frac {\lambda ^{k}e^{-\lambda }}{k!}}}, \qquad k = 0, 1, 2, \dots\)
(derivation)
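One common route to this pmf is as the $n \to \infty$ limit of $Binom(n, \lambda/n)$; a numeric sketch of that convergence (with $\lambda = 3$ chosen arbitrarily, using scipy.stats):

```python
import numpy as np
from scipy.stats import binom, poisson

lam = 3.0  # arbitrary rate
k = np.arange(10)

# The Binom(n, lam/n) pmf approaches the Poisson(lam) pmf as n grows.
for n in (10, 100, 10_000):
    gap = np.max(np.abs(binom.pmf(k, n, lam / n) - poisson.pmf(k, lam)))
    print(n, gap)  # max pmf gap shrinks toward 0
```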