Probability theory is the branch of mathematics concerned with analyzing random phenomena and modeling uncertainty. It provides a framework for quantifying and reasoning about the likelihood of events.
Basic Concepts
Probability Space
A probability space is a mathematical construct that models a random experiment. It consists of three components:
- Sample Space \( \Omega \): The set of all possible outcomes.
- Event \( E \): A subset of the sample space, representing one or more outcomes.
- Probability Measure \( P \): A function that assigns a probability to each event.
Axioms of Probability
The probability measure \( P \) must satisfy the following axioms:
- Non-negativity: \( P(E) \geq 0 \) for any event \( E \).
- Normalization: \( P(\Omega) = 1 \).
- Countable additivity: For any countable sequence of mutually exclusive (pairwise disjoint) events \( E_1, E_2, \ldots \),
$$ P\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i) $$
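As a concrete sanity check, here is a minimal sketch in Python that models a fair six-sided die as a finite probability space and verifies the three axioms directly; the die and the particular events are illustrative choices, not part of the theory.

```python
from fractions import Fraction

# Illustrative finite probability space: a fair six-sided die.
omega = {1, 2, 3, 4, 5, 6}
weights = {outcome: Fraction(1, 6) for outcome in omega}

def prob(event):
    """Probability measure P: sum the weights of outcomes in the event."""
    return sum(weights[outcome] for outcome in event)

# Axiom 1 (non-negativity) and Axiom 2 (normalization).
assert all(prob({o}) >= 0 for o in omega)
assert prob(omega) == 1

# Axiom 3 (additivity) for two disjoint events: "even" and "rolled a 1".
evens, one = {2, 4, 6}, {1}
assert prob(evens | one) == prob(evens) + prob(one)
print("all three axioms hold for this space")
```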
Conditional Probability
The conditional probability of an event \( A \) given that \( B \) has occurred (with \( P(B) > 0 \)) is defined as:
$$ P(A|B) = \frac{P(A \cap B)}{P(B)} $$
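To make the definition concrete, the short sketch below computes \( P(A|B) \) exactly by enumerating outcomes for an assumed experiment of rolling two fair dice; the particular events \( A \) and \( B \) are arbitrary illustrations.

```python
from fractions import Fraction
from itertools import product

# Assumed experiment: roll two fair dice; every outcome is equally likely.
outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

A = lambda o: o[0] + o[1] == 8  # event A: the dice sum to 8
B = lambda o: o[0] == 3         # event B: the first die shows 3

# P(A|B) = P(A and B) / P(B)
p_a_given_b = prob(lambda o: A(o) and B(o)) / prob(B)
print(p_a_given_b)  # 1/6: given a first roll of 3, only (3, 5) sums to 8
```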
Bayes’ Theorem
Bayes’ Theorem relates the two conditional probabilities \( P(A|B) \) and \( P(B|A) \); it follows directly from writing \( P(A \cap B) \) as both \( P(A|B)P(B) \) and \( P(B|A)P(A) \):
$$ P(A|B) = \frac{P(B|A)P(A)}{P(B)} $$
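A quick numerical illustration of the theorem, using made-up diagnostic-test numbers (the prior, sensitivity, and false-positive rate below are assumptions chosen only for the example):

```python
# Hypothetical diagnostic-test numbers, chosen only for illustration.
p_disease = 0.01            # P(A): prior probability of disease
p_pos_given_disease = 0.95  # P(B|A): test sensitivity
p_pos_given_healthy = 0.05  # false-positive rate

# Total probability: P(B) = P(B|A)P(A) + P(B|not A)P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A|B) = P(B|A)P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"{p_disease_given_pos:.3f}")  # ≈ 0.161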
Random Variables
A random variable is a function that assigns a real number to each outcome in the sample space. There are two main types:
- Discrete Random Variables: Take on a countable set of values.
- Continuous Random Variables: Take values in a continuum (an uncountable set), such as an interval of the real line.
Probability Distributions
The probability distribution of a random variable describes how probabilities are distributed over its possible values.
Discrete Distributions
For a discrete random variable \( X \), the probability mass function (PMF) \( p(x) \) gives the probability that \( X \) takes the value \( x \):
$$ p(x) = P(X = x) $$
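For instance, a minimal sketch that tabulates the PMF of the sum of two fair dice and checks that the probabilities sum to one (the two-dice experiment is an illustrative choice):

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Illustrative discrete variable: X = sum of two fair dice.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

print(pmf[7])                  # 1/6, the most likely sum
assert sum(pmf.values()) == 1  # the PMF sums to one over all values
```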
Continuous Distributions
For a continuous random variable \( X \), the probability density function (PDF) \( f(x) \) describes the relative likelihood of values of \( X \); any single exact value has probability zero, and probabilities are obtained by integrating the density:
$$ P(a \leq X \leq b) = \int_a^b f(x) \, dx $$
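A small sketch that approximates this integral numerically with the midpoint rule, for an assumed density \( f(x) = 2x \) on \( [0, 1] \):

```python
# Midpoint-rule approximation of P(a <= X <= b) for an assumed
# density f(x) = 2x on [0, 1] (zero elsewhere).
def prob_between(f, a, b, n=100_000):
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: 2 * x  # integrates to 1 over [0, 1], so it is a valid PDF
print(round(prob_between(f, 0.0, 0.5), 6))  # exact answer: 0.25
```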
Expectation and Variance
Expectation
The expectation (or mean) of a random variable \( X \) is a measure of its central tendency. For a discrete random variable:
$$ \mathbb{E}[X] = \sum_{x} x p(x) $$
For a continuous random variable:
$$ \mathbb{E}[X] = \int_{-\infty}^{\infty} x f(x) \, dx $$
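The sketch below evaluates both formulas: the discrete sum for a fair die, and the continuous integral (approximated by the midpoint rule) for the same assumed density \( f(x) = 2x \) on \( [0, 1] \) used above.

```python
from fractions import Fraction

# Discrete case: expected value of one fair-die roll.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
print(sum(x * p for x, p in pmf.items()))  # 7/2

# Continuous case: E[X] for the assumed density f(x) = 2x on [0, 1],
# approximated by the midpoint rule.
n = 100_000
h = 1.0 / n
mean = sum(((i + 0.5) * h) * 2 * ((i + 0.5) * h) for i in range(n)) * h
print(round(mean, 4))  # exact answer is 2/3
```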
Variance
The variance of a random variable \( X \) measures the spread of its values. It is defined as:
$$ \text{Var}(X) = \mathbb{E}[(X - \mathbb{E}[X])^2] $$
Expanding the square and using the linearity of expectation gives the equivalent computational form:
$$ \text{Var}(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2 $$
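A quick check that the two formulas agree, again using a fair die as the assumed example:

```python
from fractions import Fraction

# Fair die: verify that both variance formulas give the same value.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mean = sum(x * p for x, p in pmf.items())

var_def = sum((x - mean) ** 2 * p for x, p in pmf.items())
var_shortcut = sum(x ** 2 * p for x, p in pmf.items()) - mean ** 2

assert var_def == var_shortcut
print(var_def)  # 35/12
```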
Common Probability Distributions
Binomial Distribution
A discrete distribution giving the number of successes in \( n \) independent Bernoulli trials, each succeeding with probability \( p \):
$$ P(X = k) = \frac{n!}{k!(n-k)!} p^k (1-p)^{n-k}, \quad k = 0, 1, \ldots, n $$
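A minimal sketch of this PMF using Python's math.comb; the 10-flip fair-coin example is an arbitrary illustration:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Illustrative check: 10 fair-coin flips.
print(round(binomial_pmf(5, 10, 0.5), 4))  # 0.2461
assert abs(sum(binomial_pmf(k, 10, 0.5) for k in range(11)) - 1) < 1e-12
```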
Normal Distribution
A continuous distribution characterized by its mean \( \mu \) and variance \( \sigma^2 \):
$$ f(x) = \frac{1}{\sqrt{2 \pi \sigma^2}} e^{-\frac{(x - \mu)^2}{2 \sigma^2}} $$
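As a numerical check, the sketch below evaluates this density for the standard normal (\( \mu = 0 \), \( \sigma = 1 \)) and confirms the familiar rule that roughly 68% of the mass lies within one standard deviation of the mean, using midpoint-rule integration:

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / sqrt(2 * pi * sigma ** 2)

# Midpoint-rule integration of the standard normal over [-1, 1].
n, a, b = 10_000, -1.0, 1.0
h = (b - a) / n
mass = sum(normal_pdf(a + (i + 0.5) * h) for i in range(n)) * h
print(round(mass, 4))  # ≈ 0.6827, the "68%" of the 68-95-99.7 rule
```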
Exponential Distribution
A continuous distribution describing the waiting time between events in a Poisson process, with rate parameter \( \lambda > 0 \):
$$ f(x) = \lambda e^{-\lambda x}, \quad x \geq 0 $$
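One way to see this density in action is inverse-CDF sampling: if \( U \sim \text{Uniform}(0,1) \), then \( -\ln(1-U)/\lambda \) is exponential with rate \( \lambda \). The sketch below uses an assumed rate of 2.0 and checks that the sample mean lands near \( 1/\lambda \):

```python
import random
from math import log

# Inverse-CDF sampling: -ln(1 - U) / lam is exponential with rate lam
# (lam = 2.0 is an assumed value chosen for illustration).
lam = 2.0
random.seed(0)
samples = [-log(1.0 - random.random()) / lam for _ in range(100_000)]

print(round(sum(samples) / len(samples), 3))  # near the mean 1/lam = 0.5
```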
Law of Large Numbers and Central Limit Theorem
Law of Large Numbers
The law of large numbers states that, for independent, identically distributed random variables \( X_1, X_2, \ldots \) with finite mean, the sample mean converges to the expected value as the number of trials increases:
$$ \frac{1}{n} \sum_{i=1}^{n} X_i \rightarrow \mathbb{E}[X] \quad \text{as} \quad n \to \infty $$
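A simulation makes this tangible: the sketch below tracks the sample mean of fair-die rolls, which should drift toward \( \mathbb{E}[X] = 3.5 \) as the number of rolls grows (the sample sizes are arbitrary choices):

```python
import random

# Sample means of fair-die rolls for growing sample sizes.
random.seed(42)
for n in (10, 1_000, 100_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(n, round(sum(rolls) / n, 3))
# The printed means drift toward E[X] = 3.5 as n grows.
```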
Central Limit Theorem
The central limit theorem states that the standardized sum (or average) of a large number of independent, identically distributed random variables with mean \( \mu \) and finite variance \( \sigma^2 \) converges in distribution to a normal distribution, regardless of the original distribution:
$$ \frac{\sum_{i=1}^{n} X_i - n \mu}{\sigma \sqrt{n}} \rightarrow N(0, 1) \quad \text{as} \quad n \to \infty $$
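The sketch below illustrates this with sums of \( \text{Uniform}(0,1) \) variables (mean \( 1/2 \), variance \( 1/12 \)): after standardizing, about 68% of the simulated values should fall within one unit of zero, matching the standard normal; the sample sizes are arbitrary.

```python
import random
from math import sqrt

# Standardized sums of Uniform(0, 1) variables (mu = 1/2, sigma^2 = 1/12).
random.seed(1)
n, trials = 200, 10_000
mu, sigma = 0.5, sqrt(1 / 12)

def standardized_sum():
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (sigma * sqrt(n))

zs = [standardized_sum() for _ in range(trials)]
within_one = sum(1 for z in zs if -1.0 <= z <= 1.0) / trials
print(round(within_one, 3))  # close to the standard-normal value ≈ 0.683
```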
Conclusion
Probability theory provides a rigorous foundation for analyzing random phenomena and making inferences about uncertain events. Its concepts and methods are essential for understanding and modeling a wide range of real-world processes.