# Probability

# Basic Concepts

**Probability** is a measure of the likelihood of an event occurring.

- It is quantitatively represented as a number between 0 and 1, inclusive.
- The probability of an **impossible event** is 0, while the probability of a **certain event** is 1.
- An **experiment** is a process that leads to one of several possible outcomes. The set of all possible outcomes is the **sample space**.
- An **event** is a subset of the sample space.
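These definitions can be made concrete with a short Python sketch; the fair-die experiment below is an illustrative assumption, not something from the text:

```python
from fractions import Fraction

# Experiment: rolling a fair six-sided die (illustrative assumption).
sample_space = {1, 2, 3, 4, 5, 6}

# An event is a subset of the sample space, e.g. "roll an even number".
even = {2, 4, 6}

def probability(event, space):
    """Probability of an event when all outcomes are equally likely."""
    assert event <= space, "an event must be a subset of the sample space"
    return Fraction(len(event), len(space))

print(probability(even, sample_space))          # P(even) -> 1/2
print(probability(set(), sample_space))         # impossible event -> 0
print(probability(sample_space, sample_space))  # certain event -> 1
```

Note how the impossible event (the empty set) and the certain event (the whole sample space) land exactly on the endpoints 0 and 1.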

# Probability Axioms

- **Nonnegativity**: The probability of an event is always a nonnegative real number.
- **Normalization**: The probability of the sample space is 1.
- **Additivity**: The probability of the union of two mutually exclusive events is the sum of their probabilities.
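A minimal check of the three axioms, assuming a fair six-sided die as the distribution (an illustrative choice, not from the text):

```python
from fractions import Fraction

# Probabilities for a fair die, one per outcome (illustrative assumption).
p = {k: Fraction(1, 6) for k in range(1, 7)}
P = lambda event: sum(p[k] for k in event)

# Nonnegativity: every probability is >= 0.
assert all(v >= 0 for v in p.values())

# Normalization: the probabilities over the whole sample space sum to 1.
assert sum(p.values()) == 1

# Additivity: for disjoint events A and B, P(A ∪ B) = P(A) + P(B).
A, B = {1, 2}, {5, 6}
assert A.isdisjoint(B)
assert P(A | B) == P(A) + P(B)

print("all three axioms hold for this distribution")
```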

# Probability Laws

- **Complementary rule**: The probability of the complement of an event is 1 minus the probability of the event.
- **Multiplication rule**: The probability of the intersection of two events is the product of the probabilities of the events, provided the events are independent.
- **Addition rule**: The probability of the union of two events is the sum of their probabilities minus the probability of their intersection.
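All three laws can be verified by enumerating a small sample space; the two-dice experiment and the events A and B below are illustrative assumptions:

```python
from itertools import product
from fractions import Fraction

# Sample space of two fair dice: 36 equally likely outcomes (illustrative).
space = list(product(range(1, 7), repeat=2))

def P(pred):
    """Probability of the event {outcome : pred(outcome)} under uniform outcomes."""
    return Fraction(sum(1 for o in space if pred(o)), len(space))

A = lambda o: o[0] == 6       # first die shows 6
B = lambda o: o[1] % 2 == 0   # second die is even (independent of A)

# Complementary rule: P(not A) = 1 - P(A)
assert P(lambda o: not A(o)) == 1 - P(A)

# Multiplication rule (independent events): P(A ∩ B) = P(A) * P(B)
assert P(lambda o: A(o) and B(o)) == P(A) * P(B)

# Addition rule: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
assert P(lambda o: A(o) or B(o)) == P(A) + P(B) - P(lambda o: A(o) and B(o))

print("complement, multiplication, and addition rules verified")
```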

# Conditional Probability

**Conditional probability** is the probability of an event given that another event has occurred.

- The formula for conditional probability is **P(A | B) = P(A ∩ B) / P(B)**, where **P(A | B)** denotes the probability of event A given that event B has occurred.
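As a sketch of the formula, assuming two fair dice (an illustrative setup): let B be "the sum is 8" and A be "the first die shows 4".

```python
from itertools import product
from fractions import Fraction

# Two fair dice: 36 equally likely outcomes (illustrative assumption).
space = set(product(range(1, 7), repeat=2))

def P(event):
    return Fraction(len(event), len(space))

B = {o for o in space if o[0] + o[1] == 8}  # sum is 8: five outcomes
A = {o for o in space if o[0] == 4}         # first die shows 4

# P(A | B) = P(A ∩ B) / P(B)
cond = P(A & B) / P(B)
print(cond)  # 1/5: of the five outcomes summing to 8, only (4, 4) has first die 4
```

Conditioning on B shrinks the sample space to B itself, which is why the probability of A jumps from 1/6 to 1/5.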

# Bayes’ Theorem

**Bayes’ theorem** provides a way to revise existing predictions or theories given new or additional evidence.

- The theorem is expressed as **P(A | B) = [P(B | A) · P(A)] / P(B)**.
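A common illustration is a diagnostic test; every number below is a hypothetical assumption chosen only to show the mechanics of the theorem:

```python
# Hypothetical numbers for illustration only -- not from the text.
p_disease = 0.01              # P(A): prior probability of the disease
p_pos_given_disease = 0.95    # P(B | A): probability of a positive test if diseased
p_pos_given_healthy = 0.05    # probability of a positive test if healthy

# Total probability: P(B) = P(B|A)·P(A) + P(B|not A)·P(not A)
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))

# Bayes' theorem: P(A | B) = P(B | A) · P(A) / P(B)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))
```

Even with a sensitive test, the posterior stays modest here because the prior is small: most positives come from the much larger healthy population.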

# Random Variables

**Random variables** are numerical outcomes of a random phenomenon.

- A **discrete random variable** can take on a countable number of values, while a **continuous random variable** can take on an infinite number of values within an interval.

# Probability Distributions

- A **probability distribution** describes how probabilities are distributed over the values of a random variable.
- For discrete random variables, this distribution can be described by a **probability mass function**. For continuous random variables, it can be described by a **probability density function**.
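A probability mass function can be tabulated directly for a small discrete example; the sum of two fair dice is an illustrative assumption:

```python
from collections import Counter
from itertools import product
from fractions import Fraction

# PMF of the sum of two fair dice (illustrative discrete random variable).
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {s: Fraction(c, 36) for s, c in counts.items()}

# A PMF assigns a probability to each possible value,
# and those probabilities sum to 1.
assert sum(pmf.values()) == 1
print(pmf[7])  # 7 is the most likely sum: 6/36 = 1/6
```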

# Expected Value and Variance

- The **expected value** (mean) of a random variable is an average of the possible outcomes, with each outcome weighted by its probability.
- The **variance** measures the expected squared deviation of the random variable from its mean.
- The **standard deviation** is the square root of the variance and provides a measure of spread in the same units as the variable itself.
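All three quantities can be computed directly from a PMF; the fair die is again an illustrative assumption:

```python
from fractions import Fraction
import math

# Fair six-sided die (illustrative assumption).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

# Expected value: E[X] = sum of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Variance: Var(X) = E[(X - E[X])^2]
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

# Standard deviation: the square root of the variance
std = math.sqrt(variance)

print(mean)      # 7/2
print(variance)  # 35/12
print(round(std, 3))
```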

# Important Distributions

- Some important distributions for continuous random variables are the **uniform distribution**, the **normal distribution**, and the **exponential distribution**.
- Important distributions for discrete random variables include the **binomial distribution**, the **Poisson distribution**, and the **geometric distribution**.
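The discrete PMFs can be written out from their standard formulas; the parameters below (n = 10, p = 0.3, λ = 4, and so on) are illustrative choices:

```python
import math

# Binomial PMF: P(X = k) = C(n, k) * p^k * (1-p)^(n-k)
def binomial_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Poisson PMF: P(X = k) = λ^k * e^(-λ) / k!
def poisson_pmf(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

# Geometric PMF (first success on trial k): P(X = k) = (1-p)^(k-1) * p
def geometric_pmf(k, p):
    return (1 - p)**(k - 1) * p

# Each PMF sums to 1 over its support (numerically, over a long prefix
# for the two distributions with infinite support).
assert abs(sum(binomial_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
assert abs(sum(poisson_pmf(k, 4.0) for k in range(100)) - 1) < 1e-12
assert abs(sum(geometric_pmf(k, 0.2) for k in range(1, 200)) - 1) < 1e-12

print("all three discrete PMFs sum to 1")
```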