Law of Large Numbers

Probability Theory
theorem

Grade 9-12


The Law of Large Numbers states that as the number of independent, identically distributed trials increases, the sample average converges to the theoretical expected value (population mean). The Law of Large Numbers is the theoretical foundation for using samples to estimate population parameters.

Definition

The Law of Large Numbers states that as the number of independent, identically distributed trials increases, the sample average converges to the theoretical expected value (population mean). In other words, larger samples produce more reliable estimates of the true probability or average.

💡 Intuition

Flip a coin 10 times: maybe 7 heads (70%). Flip 100 times: closer to 50%. Flip 10,000 times: very close to 50%. More trials = more reliable averages. Short-run luck evens out.

🎯 Core Idea

As the number of independent trials grows, the sample average converges to the theoretical expected value. Short-run results are unpredictable; long-run averages are stable.

Example

Casino edge: Over 100 bets you might come out ahead, but over 1 million bets the casino's 2% edge makes a profit virtually certain.

🌟 Why It Matters

The Law of Large Numbers is the theoretical foundation for using samples to estimate population parameters. It is why insurance companies can set profitable premiums, casinos always win in the long run, and political polls can predict elections.

💭 Hint When Stuck

To see the Law of Large Numbers in action, simulate coin flips. After 10 flips, the proportion of heads may be far from 0.5. After 100 flips, it will be closer. After 10,000 flips, it will be very close to 0.5. The key insight is that the average stabilizes as the sample grows, but individual outcomes remain random.
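The suggested simulation can be sketched in a few lines of Python (the helper name `heads_proportion` is my own):

```python
import random

random.seed(42)  # reproducible runs

def heads_proportion(n_flips):
    """Flip a fair coin n_flips times and return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

for n in (10, 100, 10_000):
    p = heads_proportion(n)
    print(f"{n:>6} flips: proportion of heads = {p:.3f}, "
          f"distance from 0.5 = {abs(p - 0.5):.3f}")
```

Re-running without the fixed seed shows the same pattern every time: the 10-flip proportion jumps around, while the 10,000-flip proportion stays close to 0.5.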

Formal View

Let X_1, X_2, \ldots be i.i.d. random variables with E[X_i] = \mu and finite variance. The weak law states \bar{X}_n = \frac{1}{n}\sum_{i=1}^{n} X_i \xrightarrow{P} \mu as n \to \infty, meaning P(|\bar{X}_n - \mu| > \varepsilon) \to 0 for all \varepsilon > 0.
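The statement P(|X̄_n - μ| > ε) → 0 can be checked empirically. The sketch below (function name `tail_probability` is hypothetical) uses Monte Carlo to estimate that probability for fair-coin flips (μ = 0.5) at several sample sizes:

```python
import random

random.seed(1)  # reproducible runs

def tail_probability(n, eps=0.05, trials=2000):
    """Monte Carlo estimate of P(|sample mean of n fair-coin flips - 0.5| > eps)."""
    exceed = 0
    for _ in range(trials):
        mean = sum(random.random() < 0.5 for _ in range(n)) / n
        if abs(mean - 0.5) > eps:
            exceed += 1
    return exceed / trials

for n in (10, 100, 1000):
    print(f"n = {n:>4}: estimated P(|sample mean - 0.5| > 0.05) "
          f"≈ {tail_probability(n):.3f}")
```

The estimated probability shrinks toward 0 as n grows, which is exactly what convergence in probability promises for any fixed ε.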


🚧 Common Stuck Point

Students commit the gambler's fallacy: thinking that after several tails, heads is 'due.' Each flip is independent; past outcomes do not change future probabilities.
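Independence can be demonstrated directly: simulate many fair-coin flips, find every streak of three tails, and check how often the very next flip is heads. If heads really were 'due', that frequency would exceed 0.5; it stays at 0.5. This is a minimal sketch, with all names my own:

```python
import random

random.seed(2)  # reproducible runs

# Simulate a long run of fair-coin flips (True = heads), then look at the
# outcome immediately following every streak of three tails in a row.
flips = [random.random() < 0.5 for _ in range(200_000)]

after_streak = [flips[i + 3]
                for i in range(len(flips) - 3)
                if not any(flips[i:i + 3])]  # three tails in a row

freq = sum(after_streak) / len(after_streak)
print(f"P(heads | three tails in a row) ≈ {freq:.3f}")
```

The printed frequency sits near 0.5, matching independence rather than the gambler's fallacy.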

โš ๏ธ Common Mistakes

  • Gambler's fallacy (expecting short-run results to 'balance out')
  • Applying the law to single events or small samples
  • Expecting exact convergence rather than convergence in probability

Frequently Asked Questions

What is Law of Large Numbers in Statistics?

The Law of Large Numbers states that as the number of independent, identically distributed trials increases, the sample average converges to the theoretical expected value (population mean). In other words, larger samples produce more reliable estimates of the true probability or average.

When do you use Law of Large Numbers?

To see the Law of Large Numbers in action, simulate coin flips. After 10 flips, the proportion of heads may be far from 0.5. After 100 flips, it will be closer. After 10,000 flips, it will be very close to 0.5. The key insight is that the average stabilizes as the sample grows, but individual outcomes remain random.

What do students usually get wrong about Law of Large Numbers?

Students commit the gambler's fallacy: thinking that after several tails, heads is 'due.' Each flip is independent; past outcomes do not change future probabilities.

How Law of Large Numbers Connects to Other Ideas

To understand the Law of Large Numbers, you should first be comfortable with basic probability and the mean as a fair share. Once you have a solid grasp of the Law of Large Numbers, you can move on to the central limit theorem and confidence intervals.