
central limit theorem

1. Statement

Let \(X_1,...,X_n\) be independent and identically distributed random variables, each with mean \(\mu\) and finite variance \(\sigma^2 > 0\). Let \(\bar{X}_n = \frac{1}{n}\sum_{i=1}^{n}X_i\). Then, \[ \frac{\sqrt{n}}{\sigma} (\bar{X}_n - \mu) \underset{n\rightarrow\infty}{\overset{d}{\longrightarrow}} \mathcal{N}(0,1) \]

That is, \(\frac{\sqrt{n}}{\sigma}(\bar{X}_n - \mu)\) converges in distribution to \(\mathcal{N}(0,1)\) as \(n\) grows. In practice, this means that a histogram of draws of \(\frac{\sqrt{n}}{\sigma}(\bar{X}_n - \mu)\) will approximate the standard normal density, and the approximation improves as \(n\) increases.
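
A quick way to see this is by simulation. The following is a minimal sketch, assuming NumPy and matplotlib are available; Exponential(1) is an arbitrary non-normal choice with \(\mu = \sigma = 1\).

  import numpy as np
  import matplotlib.pyplot as plt

  rng = np.random.default_rng(0)

  n = 500        # sample size per replication
  reps = 10_000  # number of independent replications of Xbar_n
  mu, sigma = 1.0, 1.0  # mean and standard deviation of Exponential(1)

  # draw reps independent samples of size n from a non-normal distribution
  samples = rng.exponential(scale=1.0, size=(reps, n))
  xbar = samples.mean(axis=1)

  # standardize: sqrt(n)/sigma * (Xbar_n - mu)
  z = np.sqrt(n) / sigma * (xbar - mu)

  # the histogram of z should track the standard normal density
  plt.hist(z, bins=60, density=True, alpha=0.6, label="standardized sample means")
  grid = np.linspace(-4, 4, 200)
  plt.plot(grid, np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi), label="N(0,1) density")
  plt.legend()
  plt.show()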

2. some notes on standardization

Where does the \(\sqrt{n}\) come from? We are standardizing the sample mean so that it has unit variance; the steps below track the variance at each stage (a numerical check follows the list).

  • An \(X_i\) has variance \(\sigma^2\).
  • \(\sum X_i\) has variance \(n\sigma^2\) (variances add because the \(X_i\) are independent).
  • \(\frac{1}{n}\sum X_i\) has variance \(\frac{1}{n}\sigma^2\).
  • \(\sqrt{n}\frac{1}{n}\sum X_i\) has variance \(\sigma^2\).
  • \(\sqrt{n}\frac{1}{n\sigma}\sum X_i\) has variance \(1\).
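
Here is the numerical check, as a sketch assuming NumPy; Uniform(0, 1), with \(\sigma^2 = 1/12\), is an arbitrary choice.

  import numpy as np

  rng = np.random.default_rng(0)

  n = 100
  reps = 200_000
  sigma2 = 1 / 12            # variance of Uniform(0, 1)
  sigma = np.sqrt(sigma2)

  x = rng.uniform(size=(reps, n))  # reps independent copies of X_1, ..., X_n

  print(np.var(x[:, 0]))                               # close to sigma^2
  print(np.var(x.sum(axis=1)))                         # close to n * sigma^2
  print(np.var(x.mean(axis=1)))                        # close to sigma^2 / n
  print(np.var(np.sqrt(n) * x.mean(axis=1)))           # close to sigma^2
  print(np.var(np.sqrt(n) / sigma * x.mean(axis=1)))   # close to 1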

3. delta method

Roughly: if \(Y_n - \theta\) (suitably scaled) converges in distribution to a normal random variable, then \(g(Y_n) - g(\theta)\) (with the same scaling) also converges in distribution to a normal random variable, provided \(g'(\theta)\) exists and \(g'(\theta) \neq 0\).

3.1. formally

Suppose \(\sqrt{n}(Y_n - \theta) \overset{d}{\rightarrow} \mathcal{N}(0,\sigma^2)\) and \(g\) is continuously differentiable at \(\theta\). Then, \[ \sqrt{n}(g(Y_n) - g(\theta)) \overset{d}{\rightarrow} \mathcal{N}(0,g'(\theta)^2\sigma^2) \]

3.2. usage

We want to use the central limit theorem for a function of a random variable. For instance, the central limit theorem may give us the limiting distribution of the estimation error for \(\theta\), but the parameter we are actually interested in is \(g(\theta)\).
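
For instance, suppose \(\hat{p}_n\) is a sample proportion, so the central limit theorem gives \(\sqrt{n}(\hat{p}_n - p) \overset{d}{\rightarrow} \mathcal{N}(0, p(1-p))\), but the quantity of interest is the log-odds \(g(p) = \log\frac{p}{1-p}\). Since \(g'(p) = \frac{1}{p(1-p)}\), the delta method gives \[ \sqrt{n}\left(g(\hat{p}_n) - g(p)\right) \overset{d}{\rightarrow} \mathcal{N}\left(0, \frac{1}{p(1-p)}\right) \]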

4. Resources

5. related

Created: 2024-07-15 Mon 01:28