Probability Rules and Compound Events
Calculate probabilities of combined events
Life rarely presents us with simple yes-or-no situations. Will it rain AND be windy tomorrow? What is the chance of passing the test OR the makeup exam? Will both of my flights arrive on time? When you buy a lottery ticket, you need all five numbers correct - not just one.
These are compound events: situations where we care about multiple things happening (or not happening) together. The good news is that there are clear rules for calculating these probabilities. Once you understand the underlying logic, you can tackle surprisingly complex real-world problems.
Think about it this way: if you already know how to find the probability of one event, compound events just ask you to combine those probabilities in the right way. The key is knowing when to add, when to multiply, and when to adjust for overlap.
Core Concepts
What Are Compound Events?
A compound event is an event made up of two or more simpler events combined. When we ask “What is the probability of A AND B?” or “What is the probability of A OR B?” we are working with compound events.
The two fundamental ways to combine events are:
- “And” (Intersection): Both events must occur. “I roll a die and get an even number AND a number greater than 3” means I need to roll a 4 or 6 (the only numbers that satisfy both conditions).
- “Or” (Union): At least one event must occur. “I roll a die and get an even number OR a number greater than 3” means I could roll 2, 4, 5, or 6 (any number that satisfies at least one condition).
Notice something important: “or” in probability means “or possibly both” - not the exclusive “one or the other but not both” that we sometimes mean in everyday speech. When a probability question asks about “A or B,” it includes the case where both A and B happen.
The Addition Rule: Probability of “Or”
When you want the probability of event A happening OR event B happening (or both), you need the Addition Rule:
$$P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B)$$
Why do we subtract $P(A \text{ and } B)$? Because when we add $P(A)$ and $P(B)$, we count the overlap twice - once when counting A and once when counting B. Subtracting the overlap corrects for this double-counting.
Think of it like counting people at a party. If 20 people are wearing hats and 15 people are wearing glasses, you cannot say 35 people are wearing hats or glasses. Some people might be wearing both! You need to subtract those who are in both groups to get the correct total.
Example in action: If you draw one card from a standard deck, what is the probability of drawing a heart OR a face card?
- $P(\text{heart}) = \frac{13}{52}$ (13 hearts in the deck)
- $P(\text{face card}) = \frac{12}{52}$ (12 face cards: J, Q, K in each of 4 suits)
- $P(\text{heart AND face card}) = \frac{3}{52}$ (J, Q, K of hearts)
$$P(\text{heart OR face card}) = \frac{13}{52} + \frac{12}{52} - \frac{3}{52} = \frac{22}{52} = \frac{11}{26}$$
If we had simply added $\frac{13}{52} + \frac{12}{52} = \frac{25}{52}$, we would have counted the Jack, Queen, and King of hearts twice.
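If you would like to verify this count by brute force, the short Python sketch below builds a 52-card deck and applies the addition rule. The rank and suit labels are just one convenient encoding chosen for illustration.

```python
from fractions import Fraction
from itertools import product

# Build a 52-card deck as (rank, suit) pairs.
ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))

hearts = {card for card in deck if card[1] == "hearts"}
faces = {card for card in deck if card[0] in {"J", "Q", "K"}}

p_heart = Fraction(len(hearts), len(deck))            # 13/52
p_face = Fraction(len(faces), len(deck))              # 12/52
p_overlap = Fraction(len(hearts & faces), len(deck))  # 3/52

print(p_heart + p_face - p_overlap)               # 11/26 (addition rule)
print(Fraction(len(hearts | faces), len(deck)))   # 11/26 (direct count of the union)
```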
Mutually Exclusive Events
Two events are mutually exclusive (also called disjoint) if they cannot happen at the same time. Their intersection is empty - there is no overlap.
$$\text{If } A \text{ and } B \text{ are mutually exclusive: } P(A \text{ and } B) = 0$$
Examples of mutually exclusive events:
- Rolling a 2 and rolling a 5 on a single die roll (you cannot roll both at once)
- Drawing a heart and drawing a spade from a deck (one card cannot be both)
- Getting heads and getting tails on a single coin flip
Examples of events that are NOT mutually exclusive:
- Rolling an even number and rolling a number greater than 3 (4 and 6 satisfy both)
- Drawing a red card and drawing a face card (J, Q, K of hearts and diamonds are both)
- A person being tall and a person having brown hair (someone can be both)
Addition Rule for Mutually Exclusive Events
When events are mutually exclusive, the addition rule simplifies beautifully. Since $P(A \text{ and } B) = 0$, we have:
$$P(A \text{ or } B) = P(A) + P(B)$$
No need to subtract anything - there is no overlap to correct for.
Example: What is the probability of rolling a 1 or a 6 on a standard die?
These events are mutually exclusive (you cannot roll both 1 and 6 on the same roll):
$$P(1 \text{ or } 6) = P(1) + P(6) = \frac{1}{6} + \frac{1}{6} = \frac{2}{6} = \frac{1}{3}$$
The Multiplication Rule: Probability of “And” for Independent Events
Two events are independent if the occurrence of one does not affect the probability of the other. Knowing that one event happened tells you nothing new about whether the other event will happen.
When events are independent, the Multiplication Rule says:
$$P(A \text{ and } B) = P(A) \cdot P(B)$$
Examples of independent events:
- Flipping two different coins (the first flip does not affect the second)
- Rolling two dice (each die is separate)
- Drawing a card, replacing it, shuffling, and drawing again (replacement means no effect)
- Whether it rains today and whether your favorite team wins tonight (unrelated events)
Examples of events that are NOT independent:
- Drawing two cards without replacement (the first draw changes what is left)
- Whether you study and whether you pass the exam (studying affects your chances)
- Whether a person smokes and whether they develop lung cancer (one affects the other)
Example: What is the probability of flipping two coins and getting heads on both?
The flips are independent:
$$P(\text{heads on both}) = P(\text{heads on first}) \cdot P(\text{heads on second}) = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}$$
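As a sanity check, a quick simulation should land near 1/4. The sketch below is one minimal way to do it in Python; the trial count and the seed are arbitrary choices.

```python
import random

random.seed(0)        # fixed seed so repeated runs give the same estimate
trials = 100_000

# Flip two independent fair coins per trial and count double heads.
double_heads = sum(
    random.random() < 0.5 and random.random() < 0.5
    for _ in range(trials)
)
print(double_heads / trials)   # close to 0.25
```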
Understanding Independence
Independence is about information. Two events are independent if knowing one happened does not change your probability estimate for the other.
Consider this: if I tell you “it rained yesterday,” does that change your estimate of the probability that a fair coin I flip today lands on heads? No - coins do not care about weather. These events are independent.
But if I tell you “the first card I drew from the deck was the Ace of Spades,” does that change the probability of drawing an Ace on my second draw (without replacement)? Yes - now there are only 3 Aces left among 51 cards instead of 4 among 52. These events are NOT independent.
Mathematically, events A and B are independent if and only if:
$$P(A \text{ and } B) = P(A) \cdot P(B)$$
This is both the definition and the multiplication rule for independent events.
“At Least One” Problems
Many real-world problems ask about the probability of “at least one” success occurring. For example:
- What is the probability of getting at least one head in three coin flips?
- What is the probability that at least one of three smoke detectors goes off during a fire?
- What is the probability of rolling at least one 6 in four dice rolls?
The trick with “at least one” problems is to use the complement. “At least one” is the opposite of “none.”
$$P(\text{at least one}) = 1 - P(\text{none})$$
This is almost always easier than directly calculating “at least one.” Why? Because “at least one” could mean exactly 1, exactly 2, exactly 3, and so on - many cases to add up. But “none” is just one case.
Example: What is the probability of getting at least one head in three fair coin flips?
The hard way: Calculate $P(\text{exactly 1 head}) + P(\text{exactly 2 heads}) + P(\text{exactly 3 heads})$.
The easy way: Calculate $P(\text{no heads})$ and subtract from 1.
$P(\text{no heads}) = P(\text{tails on all three flips})$
Since the flips are independent: $$P(\text{tails on all three}) = \frac{1}{2} \cdot \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{8}$$
Therefore: $$P(\text{at least one head}) = 1 - \frac{1}{8} = \frac{7}{8} = 87.5\%$$
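With only $2^3 = 8$ outcomes, you can also confirm the complement shortcut by listing everything. Here is a small enumeration sketch in Python (using `itertools`, purely for illustration):

```python
from fractions import Fraction
from itertools import product

# All 8 equally likely outcomes of three fair coin flips.
outcomes = list(product("HT", repeat=3))

at_least_one_head = [o for o in outcomes if "H" in o]
print(Fraction(len(at_least_one_head), len(outcomes)))   # 7/8, counted directly

no_heads = [o for o in outcomes if "H" not in o]         # only ('T', 'T', 'T')
print(1 - Fraction(len(no_heads), len(outcomes)))        # 7/8, via the complement
```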
Venn Diagrams and Probability
A Venn diagram is a visual tool that helps you see the relationship between events. Each event is represented by a circle (or other shape), and the overlap represents the intersection.
For two events A and B:
    ┌───────────────────────────────────────┐
    │                                       │
    │   ┌───────────────┬───────────────┐   │
    │   │               │               │   │
    │   │    A only     │    B only     │   │
    │   │               │               │   │
    │   │         ┌─────┴─────┐         │   │
    │   │         │  A and B  │         │   │
    │   │         └─────┬─────┘         │   │
    │   │               │               │   │
    │   └───────────────┴───────────────┘   │
    │                                       │
    │            Neither A nor B            │
    └───────────────────────────────────────┘
The regions represent:
- A only: Events in A but not in B
- B only: Events in B but not in A
- A and B: Events in both (the intersection)
- Neither: Events in neither A nor B (outside both circles)
Venn diagrams help you:
- See whether events overlap or are mutually exclusive
- Avoid double-counting when using the addition rule
- Organize complex probability problems
- Check that your probabilities make sense
For mutually exclusive events, the circles do not overlap at all.
Notation and Terminology
| Term | Meaning | Notes |
|---|---|---|
| $P(A \text{ or } B)$ | Probability of A or B (or both) | Union |
| $P(A \text{ and } B)$ | Probability of both A and B | Intersection |
| Mutually exclusive | Cannot happen together | $P(A \text{ and } B) = 0$ |
| Independent events | One does not affect the other | $P(A \text{ and } B) = P(A) \cdot P(B)$ |
| $P(A \cup B)$ | Union notation | A or B |
| $P(A \cap B)$ | Intersection notation | A and B |
| $A^c$ | Complement of A | Not A |
| Compound event | Event combining two or more events | A and B, A or B |
You may see two notations for the same concepts:
- “A or B” can be written as $A \cup B$ (union symbol)
- “A and B” can be written as $A \cap B$ (intersection symbol)
The word “union” comes from set theory - it is the set of all outcomes in A or B or both. The word “intersection” also comes from set theory - it is the set of outcomes in both A and B.
Examples
A bag contains 4 red marbles, 5 blue marbles, and 3 green marbles. If you draw one marble at random, what is the probability of drawing a red marble or a green marble?
Solution:
First, recognize that “drawing a red marble” and “drawing a green marble” are mutually exclusive events. A single marble cannot be both red and green at the same time.
Since these events are mutually exclusive, we can use the simplified addition rule:
$$P(\text{red or green}) = P(\text{red}) + P(\text{green})$$
Total marbles: $4 + 5 + 3 = 12$
$$P(\text{red}) = \frac{4}{12} = \frac{1}{3}$$
$$P(\text{green}) = \frac{3}{12} = \frac{1}{4}$$
$$P(\text{red or green}) = \frac{4}{12} + \frac{3}{12} = \frac{7}{12} \approx 0.583 = 58.3\%$$
Alternatively, you can think of this as: there are 7 marbles that are either red or green out of 12 total marbles.
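A quick simulation gives roughly the same answer. The sketch below draws one marble at random many times; the trial count and seed are arbitrary choices made for illustration.

```python
import random

random.seed(1)
bag = ["red"] * 4 + ["blue"] * 5 + ["green"] * 3   # 12 marbles

trials = 100_000
hits = sum(random.choice(bag) in ("red", "green") for _ in range(trials))
print(hits / trials)   # close to 7/12, about 0.583
```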
A fair coin is flipped and a fair die is rolled. What is the probability of getting heads AND rolling a number greater than 4?
Solution:
The coin flip and the die roll are independent events - the outcome of one does not affect the other.
For independent events: $$P(A \text{ and } B) = P(A) \cdot P(B)$$
Let us find each probability:
$$P(\text{heads}) = \frac{1}{2}$$
Numbers greater than 4 on a die are 5 and 6, so: $$P(\text{greater than 4}) = \frac{2}{6} = \frac{1}{3}$$
Therefore: $$P(\text{heads and greater than 4}) = \frac{1}{2} \cdot \frac{1}{3} = \frac{1}{6} \approx 0.167 = 16.7\%$$
You can verify this by listing all outcomes: there are 12 equally likely outcomes when flipping a coin and rolling a die (H1, H2, H3, H4, H5, H6, T1, T2, T3, T4, T5, T6). Only H5 and H6 satisfy both conditions - that is 2 out of 12, or $\frac{1}{6}$.
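That verification by listing is easy to automate. The Python sketch below enumerates all 12 (coin, die) outcomes; the 'H'/'T' labels are simply a convenient encoding.

```python
from fractions import Fraction
from itertools import product

# All 12 equally likely (coin, die) outcomes.
outcomes = list(product("HT", range(1, 7)))

favorable = [(coin, die) for coin, die in outcomes if coin == "H" and die > 4]
print(favorable)                                 # [('H', 5), ('H', 6)]
print(Fraction(len(favorable), len(outcomes)))   # 1/6
```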
In a class of 30 students, 18 play soccer, 12 play basketball, and 6 play both sports. If a student is selected at random, what is the probability that the student plays soccer or basketball?
Solution:
Let S = event that a student plays soccer, and B = event that a student plays basketball.
These events are NOT mutually exclusive because 6 students play both sports. We must use the general addition rule.
Given information:
- $P(S) = \frac{18}{30}$
- $P(B) = \frac{12}{30}$
- $P(S \text{ and } B) = \frac{6}{30}$
Applying the addition rule: $$P(S \text{ or } B) = P(S) + P(B) - P(S \text{ and } B)$$
$$P(S \text{ or } B) = \frac{18}{30} + \frac{12}{30} - \frac{6}{30} = \frac{24}{30} = \frac{4}{5} = 0.8 = 80\%$$
Venn diagram verification:
- Soccer only: $18 - 6 = 12$ students
- Basketball only: $12 - 6 = 6$ students
- Both sports: 6 students
- Neither sport: $30 - 12 - 6 - 6 = 6$ students
Total playing at least one sport: $12 + 6 + 6 = 24$ students
$$P(S \text{ or } B) = \frac{24}{30} = \frac{4}{5}$$
The Venn diagram approach confirms our answer. Notice that 24 students play at least one sport, leaving 6 students who play neither.
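Python sets make the addition rule's bookkeeping explicit. In the sketch below, students are given hypothetical ID numbers arranged so that 18 play soccer, 12 play basketball, and 6 play both; the IDs themselves are an assumption made only for illustration.

```python
from fractions import Fraction

# Hypothetical roster: IDs 1-18 play soccer, IDs 13-24 play basketball,
# so IDs 13-18 (exactly 6 students) play both sports.
soccer = set(range(1, 19))
basketball = set(range(13, 25))
total = 30

p_union = Fraction(len(soccer | basketball), total)
p_addition_rule = (Fraction(len(soccer), total)
                   + Fraction(len(basketball), total)
                   - Fraction(len(soccer & basketball), total))

print(p_union, p_addition_rule)   # 4/5 4/5 -- both approaches agree
```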
A smoke detector has a 95% probability of going off when there is a fire. If a house has 3 independent smoke detectors, what is the probability that at least one detector goes off during a fire?
Solution:
This is an “at least one” problem, so we use the complement approach:
$$P(\text{at least one goes off}) = 1 - P(\text{none go off})$$
First, find the probability that a single detector fails to go off: $$P(\text{detector fails}) = 1 - 0.95 = 0.05$$
Since the detectors are independent, the probability that ALL THREE fail is: $$P(\text{all three fail}) = 0.05 \times 0.05 \times 0.05 = (0.05)^3 = 0.000125$$
Therefore: $$P(\text{at least one goes off}) = 1 - 0.000125 = 0.999875 = 99.9875\%$$
Even though each individual detector has a 5% chance of failing, having three independent detectors makes the system extremely reliable. This is why redundancy (having backup systems) is so important in safety-critical applications.
Why the complement approach? If we tried to calculate this directly, we would need: $$P(\text{exactly 1}) + P(\text{exactly 2}) + P(\text{exactly 3})$$
That requires much more calculation. The complement gives us the answer in one step.
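The complement calculation is only a few lines of code. Here is a minimal sketch, assuming the detectors are independent and identical as stated:

```python
# Each detector fails to go off with probability 0.05, independently.
p_fail = 0.05
n_detectors = 3

p_all_fail = p_fail ** n_detectors    # (0.05)^3, about 0.000125
p_at_least_one = 1 - p_all_fail       # about 0.999875
print(p_all_fail, p_at_least_one)
```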
A factory produces electronic components. Each component is tested by two independent quality checks. Check A catches defects 90% of the time, and Check B catches defects 80% of the time. If a defective component goes through both checks, what is the probability that:

a) Both checks catch the defect?
b) At least one check catches the defect?
c) The defect goes undetected?
Solution:
Let A = Check A catches the defect, and B = Check B catches the defect.
Given: $P(A) = 0.90$ and $P(B) = 0.80$. The checks are independent.
a) Probability both checks catch the defect:
Since the checks are independent: $$P(A \text{ and } B) = P(A) \cdot P(B) = 0.90 \times 0.80 = 0.72 = 72\%$$
b) Probability at least one check catches the defect:
Method 1 (Addition Rule): $$P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B)$$ $$P(A \text{ or } B) = 0.90 + 0.80 - 0.72 = 0.98 = 98\%$$
Method 2 (Complement): $$P(\text{at least one catches}) = 1 - P(\text{neither catches})$$ $$P(\text{neither catches}) = P(A^c) \cdot P(B^c) = 0.10 \times 0.20 = 0.02$$ $$P(\text{at least one catches}) = 1 - 0.02 = 0.98 = 98\%$$
Both methods confirm: there is a 98% chance at least one check catches the defect.
c) Probability the defect goes undetected:
This is the complement of part (b): $$P(\text{undetected}) = P(A^c \text{ and } B^c)$$
Since the checks are independent: $$P(\text{undetected}) = P(A^c) \cdot P(B^c) = (1 - 0.90)(1 - 0.80) = 0.10 \times 0.20 = 0.02 = 2\%$$
Interpretation: Even though Check A alone has a 10% miss rate and Check B alone has a 20% miss rate, putting both checks in series cuts the combined miss rate to just 2%. This demonstrates the power of redundant testing: independent checks multiply their miss rates, so the chance that a defect slips past every check shrinks rapidly.
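All three parts follow from the two given detection rates. A short sketch, assuming independence as the problem states (variable names are illustrative):

```python
p_a, p_b = 0.90, 0.80   # detection rates for Check A and Check B

p_both = p_a * p_b                    # (a) both checks catch the defect
p_undetected = (1 - p_a) * (1 - p_b)  # (c) defect slips past both checks
p_at_least_one = 1 - p_undetected     # (b) at least one check catches it

print(round(p_both, 4), round(p_at_least_one, 4), round(p_undetected, 4))
# 0.72 0.98 0.02
```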
Key Properties and Rules
The Addition Rule (General Form)
For any two events A and B: $$P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B)$$
This works whether or not the events are mutually exclusive. Always subtract the overlap to avoid double-counting.
Addition Rule for Mutually Exclusive Events
If events A and B cannot occur together (mutually exclusive): $$P(A \text{ or } B) = P(A) + P(B)$$
Since $P(A \text{ and } B) = 0$ for mutually exclusive events, there is no overlap to subtract.
Multiplication Rule for Independent Events
If events A and B are independent (one does not affect the other): $$P(A \text{ and } B) = P(A) \cdot P(B)$$
This extends to multiple independent events: $$P(A \text{ and } B \text{ and } C) = P(A) \cdot P(B) \cdot P(C)$$
Testing for Independence
Events A and B are independent if and only if: $$P(A \text{ and } B) = P(A) \cdot P(B)$$
If the equation holds, they are independent. If not, they are dependent.
The Complement Strategy for “At Least One”
For “at least one” problems: $$P(\text{at least one success}) = 1 - P(\text{no successes})$$
When events are independent: $$P(\text{no successes in } n \text{ trials}) = [P(\text{failure})]^n$$
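This pattern is common enough to wrap in a small helper function. The sketch below is illustrative (the name `p_at_least_one` is not standard); it assumes $n$ independent trials with the same success probability.

```python
def p_at_least_one(p_success: float, n: int) -> float:
    """P(at least one success in n independent trials), via the complement."""
    return 1 - (1 - p_success) ** n

print(p_at_least_one(0.5, 3))    # three coin flips -> 0.875
print(p_at_least_one(0.95, 3))   # three smoke detectors -> about 0.999875
```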
Important Distinctions
| Concept | Definition | Rule |
|---|---|---|
| Mutually exclusive | Cannot happen together | $P(A \text{ and } B) = 0$ |
| Independent | One does not affect the other | $P(A \text{ and } B) = P(A) \cdot P(B)$ |
Note: Mutually exclusive and independent are NOT the same thing!
- Mutually exclusive events are actually dependent (if one happens, the other cannot)
- Independent events can happen together
- If $P(A) > 0$ and $P(B) > 0$, mutually exclusive events cannot be independent
Real-World Applications
Medical Screening: Multiple Tests
Medical tests are not perfect - they can have false positives and false negatives. To increase reliability, doctors often use multiple tests.
Suppose a screening test for a disease catches 95% of cases (95% sensitivity). If doctors use two independent tests, the probability of catching the disease becomes:
$$P(\text{at least one test catches it}) = 1 - (0.05)^2 = 1 - 0.0025 = 0.9975 = 99.75\%$$
This is why cancer screening programs often use multiple detection methods. The slight inconvenience and cost of additional tests dramatically reduces the chance of missing a disease.
System Reliability: Component Failure
Engineers design critical systems with redundancy. Consider an airplane with multiple engines or a hospital with backup power generators.
If a single component has a 1% chance of failure, and you have three independent backup components (any one of which is sufficient), the probability of complete system failure is:
$$P(\text{all three fail}) = (0.01)^3 = 0.000001 = 0.0001\%$$
This is one in a million - far better than the 1 in 100 risk from a single component. This principle underlies the design of everything from spacecraft to nuclear power plants.
Sports: Winning Multiple Games
A baseball team has a 60% chance of winning any single game against a particular opponent. What is the probability they win a best-of-three series?
To win the series, they need to win at least 2 games. Assuming games are independent:
- Win first two: $0.60 \times 0.60 = 0.36$
- Win first, lose second, win third: $0.60 \times 0.40 \times 0.60 = 0.144$
- Lose first, win second and third: $0.40 \times 0.60 \times 0.60 = 0.144$
$$P(\text{win series}) = 0.36 + 0.144 + 0.144 = 0.648 = 64.8\%$$
The better team's advantage is larger in a series than in a single game because a single upset loss is no longer enough to decide the outcome.
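You can reproduce this by enumerating every win/loss pattern of a three-game set. The sketch below assumes independent games with a fixed 60% win probability; playing out all three games, even when the series is already decided, does not change the probability of winning at least two.

```python
from itertools import product

p_win = 0.60
p_series = 0.0

# Sum the probability of every pattern with at least two wins.
for pattern in product("WL", repeat=3):
    if pattern.count("W") >= 2:
        p = 1.0
        for game in pattern:
            p *= p_win if game == "W" else (1 - p_win)
        p_series += p

print(round(p_series, 3))   # 0.648
```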
Security Systems: Multiple Layers
Modern security uses multiple independent layers. A building might have:
- Security cameras (85% detection rate for intruders)
- Motion sensors (90% detection rate)
- Security guards (75% detection rate)
If these are independent, the probability of detecting an intruder is:
$$P(\text{at least one detects}) = 1 - (0.15)(0.10)(0.25) = 1 - 0.00375 = 0.99625$$
There is a 99.6% chance at least one security measure detects the intrusion, even though none of the individual measures is close to 100% effective.
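The same complement calculation handles any number of layers. A minimal sketch, assuming the three detection rates above and independence between layers:

```python
import math

detection_rates = [0.85, 0.90, 0.75]   # cameras, motion sensors, guards

p_all_miss = math.prod(1 - p for p in detection_rates)   # 0.15 * 0.10 * 0.25
print(1 - p_all_miss)                                    # about 0.99625
```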
Self-Test Problems
Problem 1: A standard die is rolled. What is the probability of rolling a number that is even OR greater than 4? (Hint: These events are not mutually exclusive.)
Solution:
Let E = rolling an even number = {2, 4, 6}, and let G = rolling a number greater than 4 = {5, 6}.
These are not mutually exclusive because 6 is in both sets.
$$P(E) = \frac{3}{6} = \frac{1}{2}$$
$$P(G) = \frac{2}{6} = \frac{1}{3}$$
$$P(E \text{ and } G) = \frac{1}{6}$$ (only 6 is both even and greater than 4)
Using the addition rule: $$P(E \text{ or } G) = \frac{1}{2} + \frac{1}{3} - \frac{1}{6} = \frac{3}{6} + \frac{2}{6} - \frac{1}{6} = \frac{4}{6} = \frac{2}{3} \approx 66.7\%$$
Verification: Outcomes that are even or greater than 4: {2, 4, 5, 6} = 4 outcomes out of 6.
Problem 2: Two fair dice are rolled. What is the probability of getting a sum of 7 or a sum of 11?
Solution:
These events are mutually exclusive - you cannot get both a sum of 7 and a sum of 11 on the same roll of two dice.
Ways to get a sum of 7: (1,6), (2,5), (3,4), (4,3), (5,2), (6,1) = 6 ways
Ways to get a sum of 11: (5,6), (6,5) = 2 ways
Total possible outcomes when rolling two dice: $6 \times 6 = 36$
$$P(7) = \frac{6}{36} = \frac{1}{6}$$
$$P(11) = \frac{2}{36} = \frac{1}{18}$$
Since the events are mutually exclusive: $$P(7 \text{ or } 11) = P(7) + P(11) = \frac{6}{36} + \frac{2}{36} = \frac{8}{36} = \frac{2}{9} \approx 22.2\%$$
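If you want to double-check the counts of 6 and 2, a brief enumeration of all 36 rolls works; this is just one way to verify, not part of the intended solution.

```python
from fractions import Fraction
from itertools import product

rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes

sevens = [r for r in rolls if sum(r) == 7]     # 6 ways
elevens = [r for r in rolls if sum(r) == 11]   # 2 ways

print(Fraction(len(sevens) + len(elevens), len(rolls)))   # 2/9
```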
Problem 3: A fair coin is flipped 4 times. What is the probability of getting at least one tails?
Solution:
Use the complement approach:
$$P(\text{at least one tails}) = 1 - P(\text{no tails})$$
$P(\text{no tails}) = P(\text{all heads})$
Since coin flips are independent: $$P(\text{all heads}) = \frac{1}{2} \times \frac{1}{2} \times \frac{1}{2} \times \frac{1}{2} = \left(\frac{1}{2}\right)^4 = \frac{1}{16}$$
Therefore: $$P(\text{at least one tails}) = 1 - \frac{1}{16} = \frac{15}{16} = 0.9375 = 93.75\%$$
Problem 4: At a company, 40% of employees have a college degree and 25% speak a second language. If these characteristics are independent, what is the probability that a randomly selected employee has a college degree AND speaks a second language?
Solution:
Since the events are independent:
$$P(\text{degree and language}) = P(\text{degree}) \cdot P(\text{language})$$
$$P(\text{degree and language}) = 0.40 \times 0.25 = 0.10 = 10\%$$
10% of employees have both a college degree and speak a second language.
Problem 5: In a survey of 100 people, 60 like coffee, 45 like tea, and 25 like both coffee and tea. What is the probability that a randomly selected person likes coffee or tea (or both)?
Solution:
Let C = likes coffee, T = likes tea.
$$P(C) = \frac{60}{100} = 0.60$$
$$P(T) = \frac{45}{100} = 0.45$$
$$P(C \text{ and } T) = \frac{25}{100} = 0.25$$
Using the addition rule: $$P(C \text{ or } T) = P(C) + P(T) - P(C \text{ and } T)$$
$$P(C \text{ or } T) = 0.60 + 0.45 - 0.25 = 0.80 = 80\%$$
Verification:
- Coffee only: $60 - 25 = 35$
- Tea only: $45 - 25 = 20$
- Both: 25
- Total who like at least one: $35 + 20 + 25 = 80$ people
Problem 6: A password system requires passing two independent security checks. Check 1 has a 98% success rate for legitimate users, and Check 2 has a 95% success rate. What is the probability that a legitimate user passes both checks?
Solution:
Since the checks are independent:
$$P(\text{pass both}) = P(\text{pass Check 1}) \cdot P(\text{pass Check 2})$$
$$P(\text{pass both}) = 0.98 \times 0.95 = 0.931 = 93.1\%$$
This means 6.9% of legitimate users will be blocked by the system. This illustrates a tradeoff in security systems: multiple checks increase security against intruders but also increase the chance of blocking legitimate users.
Problem 7: A basketball player makes 70% of her free throws. In a game, she shoots 3 free throws. What is the probability that she makes at least one?
Solution:
Use the complement approach, assuming free throws are independent:
$$P(\text{at least one made}) = 1 - P(\text{all missed})$$
$$P(\text{miss a single free throw}) = 1 - 0.70 = 0.30$$
$$P(\text{all three missed}) = (0.30)^3 = 0.027$$
$$P(\text{at least one made}) = 1 - 0.027 = 0.973 = 97.3\%$$
Even though there is a 30% chance of missing any single free throw, the probability of missing all three is quite small.
Summary
- Compound events combine multiple events using “and” or “or.” Understanding how to calculate their probabilities is essential for real-world applications.
- The Addition Rule for any two events: $P(A \text{ or } B) = P(A) + P(B) - P(A \text{ and } B)$. Always subtract the overlap to avoid double-counting.
- Mutually exclusive events cannot occur together, so $P(A \text{ and } B) = 0$. For these events: $P(A \text{ or } B) = P(A) + P(B)$.
- Independent events do not affect each other’s probabilities. For independent events: $P(A \text{ and } B) = P(A) \cdot P(B)$.
- Mutually exclusive and independent are NOT the same. Mutually exclusive events with non-zero probabilities are actually dependent.
- For “at least one” problems, use the complement: $P(\text{at least one}) = 1 - P(\text{none})$. This is almost always easier than direct calculation.
- Venn diagrams provide a visual way to understand the relationships between events and avoid counting errors.
- The multiplication rule for independent events extends to any number of events: $P(A \text{ and } B \text{ and } C \text{ and } \cdots) = P(A) \cdot P(B) \cdot P(C) \cdots$
- These rules have crucial applications in medicine (multiple tests), engineering (redundant systems), sports (series outcomes), and security (multiple layers of protection).