Probability distributions for general discrete random variables
Understanding Discrete Random Variables
- Discrete Random Variables are those which can take only a countable set of distinct values, for example the number of heads obtained when flipping a coin multiple times.
- In contrast, Continuous Random Variables can take any value in a continuous range.
- Each possible value of a discrete random variable is assigned a probability, and these probabilities must sum to 1 across all possible outcomes, as the short sketch after this list illustrates.
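To make the coin-flip example concrete, here is a minimal Python sketch (variable names are only illustrative) that enumerates all outcomes of three fair coin flips and tabulates the probability of each possible number of heads; the probabilities come out non-negative and sum to 1.

```python
from collections import Counter
from itertools import product

# All 2**3 equally likely outcomes of three fair coin flips
outcomes = list(product("HT", repeat=3))

# Count how many outcomes give each number of heads
counts = Counter(o.count("H") for o in outcomes)

# Divide by the number of outcomes to get probabilities
pmf = {k: counts[k] / len(outcomes) for k in sorted(counts)}

print(pmf)                # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
print(sum(pmf.values()))  # 1.0 -- the probabilities sum to 1
```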
Probability Mass Function (PMF)
- The Probability Mass Function (PMF) is a function that gives the probability that a discrete random variable is exactly equal to some value.
- It must satisfy two conditions:
- The PMF is always non-negative, i.e. P(X=x) >= 0 for every value x
- The probabilities over all possible values sum to 1, i.e. ∑ P(X=x) = 1, where the sum runs over every possible value x (a quick validity check is sketched below)
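As a rough illustration of these two conditions, the following sketch defines a hypothetical helper, is_valid_pmf, that checks non-negativity and the sum-to-1 condition for a PMF stored as a Python dict.

```python
def is_valid_pmf(pmf, tol=1e-9):
    """Hypothetical helper: check the two PMF conditions."""
    non_negative = all(p >= 0 for p in pmf.values())   # P(X = x) >= 0
    sums_to_one = abs(sum(pmf.values()) - 1.0) < tol   # probabilities sum to 1
    return non_negative and sums_to_one

# PMF of a fair six-sided die: P(X = x) = 1/6 for x = 1, ..., 6
die_pmf = {x: 1 / 6 for x in range(1, 7)}
print(is_valid_pmf(die_pmf))             # True
print(is_valid_pmf({0: 0.5, 1: 0.6}))    # False -- the probabilities sum to 1.1
```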
Cumulative Distribution Function (CDF)
- The Cumulative Distribution Function (CDF) of a random variable is defined as F(x) = P(X <= x), the probability that the variable takes a value less than or equal to x.
- The CDF is a non-decreasing function, i.e., if x1 <= x2 then F(x1) <= F(x2)
- For discrete random variables, the CDF is a step function that jumps at each possible value, as the sketch below shows.
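One way to see the step behaviour is to build the CDF from a PMF by accumulating probabilities in order of the values; the sketch below does this for a fair six-sided die (assuming a dict-based PMF as in the earlier example).

```python
from itertools import accumulate

# PMF of a fair six-sided die, keyed by the possible values
pmf = {x: 1 / 6 for x in range(1, 7)}

# The CDF at x is the running total of P(X = k) over all k <= x
values = sorted(pmf)
cdf = dict(zip(values, accumulate(pmf[v] for v in values)))

for x, F in cdf.items():
    print(f"F({x}) = {F:.3f}")   # 0.167, 0.333, ..., 1.000 -- non-decreasing steps
```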
Expectation and Variance
- The Expectation or expected value of a discrete random variable gives the long-run average value of the variable. It is calculated by summing the products of each possible value and its corresponding probability: E[X] = ∑ x · P(X=x).
- The Variance of a discrete random variable measures the spread of the probability distribution. It is the average squared deviation from the expected value: Var(X) = ∑ (x − E[X])² · P(X=x).
- The Standard Deviation is the square root of the variance, providing a measure of spread that is in the same units as the random variable itself.
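The formulas above translate directly into code. The sketch below computes the expectation, variance, and standard deviation of a fair six-sided die from its PMF; the numbers in the comments are just the expected results under these assumptions.

```python
import math

# PMF of a fair six-sided die
pmf = {x: 1 / 6 for x in range(1, 7)}

# Expectation: sum of each value times its probability
mean = sum(x * p for x, p in pmf.items())

# Variance: expected squared deviation from the mean
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

# Standard deviation: square root of the variance, in the same units as X
std_dev = math.sqrt(variance)

print(mean)      # 3.5
print(variance)  # approximately 2.9167
print(std_dev)   # approximately 1.708
```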
Working with multiple discrete random variables
- When considering two discrete random variables together, one must consider both the Joint Probability Mass Function (joint PMF) and the Marginal Probability Mass Function (marginal PMF).
- The joint PMF gives the probability of each possible pair of outcomes, while the marginal PMF gives the probabilities for each variable considered separately.
- If the outcome of one variable does not affect the outcome of the other, the variables are said to be Independent. In that case, the joint PMF factorizes as the product of the marginal PMFs: P(X=x, Y=y) = P(X=x) · P(Y=y), as checked in the sketch after this list.
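A small sketch can tie these ideas together: the joint PMF of two independent fair coin flips is stored as a dict keyed by pairs, the marginals are obtained by summing over the other variable, and independence is confirmed by checking that the joint equals the product of the marginals (the names joint, marginal_x, and marginal_y are just illustrative).

```python
from itertools import product

# Joint PMF of two independent fair coin flips (0 = tails, 1 = heads)
joint = {(x, y): 0.25 for x, y in product((0, 1), repeat=2)}

# Marginal PMFs: sum the joint PMF over the other variable
marginal_x = {x: sum(p for (xi, _), p in joint.items() if xi == x) for x in (0, 1)}
marginal_y = {y: sum(p for (_, yi), p in joint.items() if yi == y) for y in (0, 1)}

# Independence: the joint probability equals the product of the marginals
independent = all(
    abs(joint[(x, y)] - marginal_x[x] * marginal_y[y]) < 1e-9
    for (x, y) in joint
)
print(marginal_x, marginal_y)  # {0: 0.5, 1: 0.5} {0: 0.5, 1: 0.5}
print(independent)             # True
```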
Remember to practice plenty of past papers and mark them thoroughly for a deeper understanding. Working through examples and problems is key to understanding and mastering these concepts.