Probability Theory Concepts
5 concepts · Grades 9-12
This family view narrows the full statistics map to one connected cluster. Read it from left to right: earlier nodes support later ones, and dense middle sections usually mark the concepts that the most later work depends on.
Use the graph to plan review, then use the full concept list below to open precise pages for definitions, examples, and related content.
Concept Dependency Graph
Concepts flow left to right, from foundational to advanced. Hover to highlight connections. Click any concept to learn more.
Connected Families
Probability Theory concepts have 13 connections to other families.
All Probability Theory Concepts
Independent Events
Two events are independent if knowing that one event happened does not change the probability of the other event.
"Independence means 'no update.' If learning B happened leaves the chance of A exactly the same, then the events are independent."
Why it matters: Independence determines whether probabilities multiply directly or whether a conditional adjustment is required.
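A quick simulation makes the "no update" idea concrete. The sketch below flips two fair coins many times (the coin probabilities and trial count are illustrative choices, not from the page): for independent events, the estimated probability of both occurring should be close to the product of the individual probabilities.

```python
import random

random.seed(42)
trials = 100_000

count_a = count_b = count_both = 0
for _ in range(trials):
    first = random.random() < 0.5   # event A: first coin lands heads
    second = random.random() < 0.5  # event B: second coin lands heads
    count_a += first
    count_b += second
    count_both += first and second

p_a = count_a / trials
p_b = count_b / trials
p_both = count_both / trials

# For independent events, P(A and B) ≈ P(A) * P(B).
print(round(p_both, 3), round(p_a * p_b, 3))
```

If the two estimates diverged noticeably as trials grew, that would be evidence the events are not independent.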
Addition Rule
The addition rule finds the probability that at least one of two events occurs. It adds the probabilities of the two events and then subtracts any overlap so the shared outcomes are not counted twice.
"If you want 'A or B,' start by adding A and B. Then fix the double-counting by removing the part that belongs to both events."
Why it matters: This rule appears in probability tables, card problems, survey data, and event planning whenever overlap matters.
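The classic card example can be worked through directly. This sketch (the heart-or-face-card setup is a standard illustration, not taken from the page) uses exact fractions to show the overlap correction:

```python
from fractions import Fraction

p_heart = Fraction(13, 52)       # 13 hearts in a standard 52-card deck
p_face = Fraction(12, 52)        # 12 face cards (J, Q, K in each of 4 suits)
p_heart_face = Fraction(3, 52)   # J, Q, K of hearts are counted in both events

# Addition rule: P(A or B) = P(A) + P(B) - P(A and B)
p_heart_or_face = p_heart + p_face - p_heart_face
print(p_heart_or_face)  # 11/26
```

Without subtracting the overlap, the three heart face cards would be counted twice, giving 25/52 instead of 22/52 = 11/26.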
Multiplication Rule
The multiplication rule finds the probability that two events both occur. It multiplies the probability of the first event by the conditional probability of the second event given that the first has happened.
"For an 'and' problem, move through the events in sequence. Take the chance of the first step, then update for the second step based on what is already known."
Why it matters: This rule is essential for sequential events, without-replacement problems, and multi-step chance models.
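A without-replacement draw shows why the second factor must be conditional. The sketch below (the two-aces example is a common illustration, not from the page) computes the chance of drawing two aces in a row from a standard deck:

```python
from fractions import Fraction

p_first_ace = Fraction(4, 52)           # 4 aces among 52 cards
p_second_given_first = Fraction(3, 51)  # after one ace is gone: 3 aces among 51 cards

# Multiplication rule: P(A and B) = P(A) * P(B given A)
p_both_aces = p_first_ace * p_second_given_first
print(p_both_aces)  # 1/221
```

Using 4/52 for both draws would ignore that the deck changed after the first card, overstating the probability.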
Law of Large Numbers
The Law of Large Numbers states that as the number of independent, identically distributed trials increases, the sample average converges to the theoretical expected value (population mean). In other words, larger samples produce more reliable estimates of the true probability or average.
"Flip a coin 10 times: maybe 7 heads (70%). Flip 100 times: closer to 50%. Flip 10,000 times: very close to 50%. More trials = more reliable averages. Short-run luck evens out."
Why it matters: The Law of Large Numbers is the theoretical foundation for using samples to estimate population parameters. It is why insurance companies can set profitable premiums, casinos always win in the long run, and political polls can predict elections.
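The coin-flip progression described above is easy to simulate. This sketch (trial counts and seed are illustrative) estimates the proportion of heads at increasing sample sizes, showing the estimates drifting toward the true value of 0.5:

```python
import random

random.seed(1)

def heads_proportion(flips: int) -> float:
    """Proportion of heads in `flips` tosses of a fair coin."""
    return sum(random.random() < 0.5 for _ in range(flips)) / flips

# Small samples can swing widely; large samples settle near 0.5.
for n in (10, 100, 10_000):
    print(n, heads_proportion(n))
```

The short-run values may land anywhere, but the 10,000-flip estimate will reliably sit close to 0.5, which is exactly the Law of Large Numbers at work.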
Expected Value
The expected value of a random variable is the long-run average outcome of a random process, calculated as the weighted sum of each possible outcome times its probability. It represents what you would earn or lose on average per trial if the process were repeated infinitely many times.
"If you played a game forever, expected value is your average result per play. Positive EV = profitable long-term. Negative EV = you'll lose over time. It's the mathematical way to evaluate risky decisions."
Why it matters: Expected value is the mathematical foundation of rational decision-making under uncertainty. It is used in gambling odds, insurance premium pricing, stock portfolio valuation, and game theory strategy.
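The weighted-sum definition takes one line of code. The sketch below evaluates a hypothetical game (the payoffs and probabilities are made-up numbers for illustration): win $10 with probability 0.2, lose $3 with probability 0.8.

```python
# Hypothetical game: each outcome paired with its probability.
outcomes = [10, -3]   # win $10, or lose $3
probs = [0.2, 0.8]    # probabilities must sum to 1

# Expected value: weighted sum of outcome * probability.
expected_value = sum(x * p for x, p in zip(outcomes, probs))
print(round(expected_value, 2))  # -0.4
```

A negative expected value means the game loses money on average per play, so a rational player would decline it even though individual plays can win.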