The Central Limit Theorem (CLT)
Weak law of large numbers
Markov inequality
If X is a nonnegative random variable and E[X] is small, then X is unlikely to be very large
$$
X \ge 0 \text{ and } a > 0 \implies P(X \ge a) \le \frac{E[X]}{a}
$$
Deduction (assuming $X$ is continuous with density $f_X$)
$$
P(X\ge a) = \int_a^\infty f_X(x) dx
$$
$$
E[X] = \int_0^\infty x f_X(x) dx \ge \int_a^\infty x f_X(x) dx \ge \int_a^\infty a f_X(x) dx
$$
$$
\therefore E[X] \ge a\, P(X \ge a) \implies P(X \ge a) \le \frac{E[X]}{a}
$$
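As a quick sanity check, here is a minimal simulation sketch (not part of the original notes): it assumes NumPy and an arbitrary Exponential distribution with mean 2 and threshold $a = 5$ purely for illustration.

```python
# Hedged sketch: empirically compare P(X >= a) with the Markov bound E[X] / a.
# The distribution (Exponential with mean 2), the threshold a, and the sample
# size are illustrative assumptions, not values from the notes.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1_000_000)  # nonnegative samples, E[X] = 2

a = 5.0
empirical = np.mean(x >= a)   # Monte Carlo estimate of P(X >= a)
bound = x.mean() / a          # Markov bound E[X] / a (using the sample mean)

print(f"P(X >= {a}) ~= {empirical:.4f}  <=  Markov bound {bound:.4f}")
```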
Chebyshev Inequality
An application of the Markov inequality that often gives a sharper bound
For a random variable X, if its variance is small, then X is unlikely to be far from its expectation
$$
P(|X-E[X]| \ge C) \le \frac{\sigma^2}{C^2}, \quad \text{where } C > 0 \text{ is a constant and } \sigma^2 = var(X)
$$
Deduction
Apply the Markov inequality to the nonnegative random variable $(X-\mu)^2$, where $\mu = E[X]$, with $a = C^2$:
$$
P(|X-\mu| \ge C) = P((X-\mu)^2 \ge C^2) \le \frac{E[(X-\mu)^2]}{C^2}
$$
$$
\therefore P(|X-E[X]| \ge C) \le \frac{\sigma^2}{C^2}
$$
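Again a small numerical sketch (not from the notes; NumPy, a Uniform(0, 1) example, and $C = 0.4$ are assumed for illustration) comparing the empirical two-sided tail probability with the Chebyshev bound:

```python
# Hedged sketch: compare P(|X - mu| >= C) with the Chebyshev bound sigma^2 / C^2.
# The Uniform(0, 1) distribution, C = 0.4, and sample size are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)   # mu = 0.5, sigma^2 = 1/12

mu, var = 0.5, 1.0 / 12.0
C = 0.4
empirical = np.mean(np.abs(x - mu) >= C)    # Monte Carlo estimate of the tail
bound = var / C**2                          # Chebyshev bound

print(f"P(|X - mu| >= {C}) ~= {empirical:.4f}  <=  Chebyshev bound {bound:.4f}")
```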
Weak Law of Large Numbers
Take $n$ independent, identically distributed random variables $X_1, X_2, \dots, X_n$, each with mean $\mu$ and variance $\sigma^2$, and define the sample mean
$$
M = \frac{X_1 + X_2 + \dots + X_n}{n}
$$
$$
E[M] = \frac{E[X_1] + E[X_2] + \dots + E[X_n]}{n} = \frac{n\mu}{n} = \mu
$$
By independence,
$$
var(M) = \frac{var(X_1) + var(X_2) + \dots + var(X_n)}{n^2} = \frac{\sigma^2}{n}
$$
By the Chebyshev inequality, for any $\epsilon > 0$,
$$
P(|M - \mu| \ge \epsilon) \le \frac{var(M)}{\epsilon^2} = \frac{\sigma^2}{n\epsilon^2} \to 0 \text{ as } n \to \infty
$$
so the sample mean $M$ converges in probability to $\mu$.
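A small simulation sketch of this concentration (not from the notes; NumPy, a fair six-sided die with $\mu = 3.5$, and the chosen values of $n$ are illustrative assumptions):

```python
# Hedged sketch: the sample mean of i.i.d. die rolls drifts toward mu = 3.5
# as n grows, consistent with var(M) = sigma^2 / n. The die example and the
# values of n are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
mu = 3.5  # expectation of a fair six-sided die

for n in (10, 100, 10_000, 1_000_000):
    rolls = rng.integers(1, 7, size=n)   # n i.i.d. rolls, values 1..6
    m = rolls.mean()                     # sample mean M
    print(f"n = {n:>9}:  M = {m:.4f},  |M - mu| = {abs(m - mu):.4f}")
```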
CLT
Why is the Gaussian distribution sometimes called "the distribution of God"?
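The notes stop at the question, but a minimal simulation sketch hints at the answer: standardized sums of i.i.d. variables behave like a standard normal regardless of the starting distribution. This sketch is not from the original; NumPy, Uniform(0, 1) summands, $n = 50$, and 200,000 trials are all illustrative assumptions.

```python
# Hedged sketch: standardize sums of i.i.d. Uniform(0, 1) variables and check
# that their spread matches a standard normal. Distribution, n, and trial count
# are illustrative choices, not from the notes.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 50, 200_000
mu, sigma = 0.5, np.sqrt(1.0 / 12.0)         # mean and std of Uniform(0, 1)

sums = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)
z = (sums - n * mu) / (sigma * np.sqrt(n))   # standardized sums

# For a standard normal, P(|Z| <= 1) ~ 0.6827 and P(|Z| <= 2) ~ 0.9545.
print(f"P(|Z| <= 1) ~= {np.mean(np.abs(z) <= 1):.4f}   (normal: 0.6827)")
print(f"P(|Z| <= 2) ~= {np.mean(np.abs(z) <= 2):.4f}   (normal: 0.9545)")
```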