Statistics Homework

The Law of Large Numbers (LLN)

The Law of Large Numbers states that as the sample size (\(n\)) increases, the sample mean (\(\bar{x}\)) converges in probability to the expected value (\(\mu\)) of the population:

\[\lim_{n \to \infty} P(|\bar{x} - \mu| < \epsilon) = 1 \quad \text{for every } \epsilon > 0\]

This means that, with a sufficiently large sample size, the sample mean becomes a reliable estimate of the population mean.
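As a concrete instance (a standard textbook example added here for illustration, with \(\hat{p}_n\), \(X_i\), and \(p\) introduced as notation): for i.i.d. Bernoulli(\(p\)) trials, the sample proportion of successes plays the role of \(\bar{x}\),

\[\hat{p}_n = \frac{1}{n}\sum_{i=1}^{n} X_i, \qquad \lim_{n \to \infty} P(|\hat{p}_n - p| < \epsilon) = 1,\]

so with enough flips of a coin the observed proportion of heads is very likely to lie close to the true probability \(p\).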

Proof Sketch:

The proof of the weak law follows from Chebyshev's inequality applied to the sample mean:

\[P(|\bar{x} - \mu| < \epsilon) \geq 1 - \frac{\sigma^2}{n\epsilon^2}\]

where \(\sigma^2\) is the population variance (assumed finite). As \(n\) increases, the right-hand side approaches 1 for any fixed \(\epsilon > 0\), so the probability converges to 1.
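As a quick numerical check (using the fair-die example from the next section; the specific numbers \(n = 1000\) and \(\epsilon = 0.1\) are chosen here for illustration): a fair six-sided die has \(\mu = 3.5\) and \(\sigma^2 = 35/12 \approx 2.92\), so

\[P(|\bar{x} - 3.5| < 0.1) \geq 1 - \frac{35/12}{1000 \cdot (0.1)^2} \approx 1 - 0.292 = 0.708,\]

and the bound tightens further as \(n\) grows.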

Simulations:

Simulations can visually demonstrate the LLN. Consider a fair six-sided die: as the number of rolls \(n\) increases, the running average of the rolls approaches the expected value of 3.5.

The blue line in the figure shows the running average as the number of rolls increases, converging toward the expected value (red line at 3.5).
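A minimal simulation sketch that reproduces such a figure (Python with NumPy and Matplotlib; the seed and the number of rolls are arbitrary choices, not taken from the original):

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulate rolls of a fair six-sided die (assumed setup: 10,000 rolls, fixed seed).
rng = np.random.default_rng(seed=42)
n_rolls = 10_000
rolls = rng.integers(1, 7, size=n_rolls)  # uniform integers 1..6

# Running average after 1, 2, ..., n_rolls rolls.
running_avg = np.cumsum(rolls) / np.arange(1, n_rolls + 1)

plt.plot(running_avg, color="blue", label="running average of rolls")
plt.axhline(3.5, color="red", linestyle="--", label="expected value (3.5)")
plt.xlabel("number of rolls (n)")
plt.ylabel("average value")
plt.title("Law of Large Numbers: fair six-sided die")
plt.legend()
plt.show()
```

Running the script shows the running average fluctuating widely for small \(n\) and settling near 3.5 as \(n\) grows, which is exactly the behavior the LLN predicts.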
