# Continuous random variables

## Understanding Continuous Random Variables

• A continuous random variable can take any value within an interval of the real line (which may be unbounded).
• Unlike discrete random variables, which take values from a countable set of specific values, a continuous random variable can take uncountably many different values.
• Examples of quantities that could be modelled using continuous random variables include (but are not limited to) time, distance, and temperature.

## Probability Density Function

• A continuous random variable is described by a probability density function (pdf).
• The total area under the curve of the probability density function equals 1, since it accounts for all possible outcomes.
• The probability that a continuous random variable takes any single exact value is zero; probabilities are assigned to intervals (as areas), not to individual points.
• The area under the curve of the pdf for a given range of values is equal to the probability that the variable will be within that range.
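These properties can be checked numerically. The sketch below (assuming `scipy` and `numpy` are available) uses an exponential pdf f(x) = λe^(−λx) with an arbitrary illustrative rate λ = 0.5, integrates it to confirm the total area is 1, and computes an interval probability as an area:

```python
import numpy as np
from scipy.integrate import quad

# Exponential pdf with rate lam (an arbitrary illustrative choice)
lam = 0.5
pdf = lambda x: lam * np.exp(-lam * x)

# Total area under the pdf over its support [0, inf) should equal 1
total, _ = quad(pdf, 0, np.inf)

# P(1 <= X <= 2) is the area under the pdf between 1 and 2,
# which for this pdf equals e^(-0.5) - e^(-1)
p_1_to_2, _ = quad(pdf, 1, 2)

print(round(total, 6))     # 1.0
print(round(p_1_to_2, 6))
```

Any valid pdf would behave the same way here; the exponential is chosen only because its support and closed forms keep the check simple.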

## Cumulative Distribution Function

• The cumulative distribution function (CDF) for a continuous random variable gives the probability that the variable is less than or equal to a certain value: F(x) = P(X ≤ x).
• The CDF is the integral of the pdf from negative infinity up to that value; conversely, the pdf is the derivative of the CDF.
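The integral relationship can be made concrete. This sketch (again assuming `scipy`, and reusing an exponential example with the arbitrary rate λ = 0.5) builds the CDF by integrating the pdf, then checks it against the known closed form F(x) = 1 − e^(−λx):

```python
import numpy as np
from scipy.integrate import quad

lam = 0.5  # arbitrary rate for an exponential example
pdf = lambda t: lam * np.exp(-lam * t)

def cdf(x):
    """P(X <= x): integrate the pdf from the lower end of the support to x."""
    area, _ = quad(pdf, 0, x)
    return area

# Numerical CDF matches the closed form F(x) = 1 - exp(-lam * x)
x = 3.0
print(round(cdf(x), 6), round(1 - np.exp(-lam * x), 6))
```

The lower limit of integration is 0 rather than −∞ only because this pdf is zero for negative values.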

## Expected Value and Variance

• The expected value E[X] of a continuous random variable is calculated as the integral of x times the pdf over the range of possible values: E[X] = ∫ x f(x) dx.
• The variance Var[X] is the expected value of (X - E[X])^2. Equivalently, Var[X] = E[X^2] - (E[X])^2, which is often easier to compute.
• The standard deviation is the square root of the variance.
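As a worked check, these integrals can be evaluated numerically. The sketch below (assuming `scipy`, with the same arbitrary exponential example, λ = 0.5) computes E[X] and E[X^2] by integration and then uses Var[X] = E[X^2] - (E[X])^2; for this distribution the exact answers are E[X] = 1/λ = 2 and Var[X] = 1/λ^2 = 4:

```python
import numpy as np
from scipy.integrate import quad

lam = 0.5
pdf = lambda x: lam * np.exp(-lam * x)

# E[X] = integral of x * f(x); E[X^2] = integral of x^2 * f(x)
ex, _  = quad(lambda x: x * pdf(x), 0, np.inf)
ex2, _ = quad(lambda x: x**2 * pdf(x), 0, np.inf)

var = ex2 - ex**2       # Var[X] = E[X^2] - (E[X])^2
sd = np.sqrt(var)       # standard deviation is the square root of the variance

print(round(ex, 4), round(var, 4), round(sd, 4))  # 2.0 4.0 2.0
```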

## Working with the Normal Distribution

• The Normal Distribution is a specific type of continuous random variable with a bell-shaped probability density function.
• It is defined by two parameters: the mean (μ) and the variance (σ^2).
• The area under the curve of the pdf for a normal distribution can be calculated using normal tables or computer software.
• Standardising a normal variable (transforming it to a standard normal variable with mean 0 and variance 1) can make calculations easier, especially when consulting tables or using software. This is done using the formula Z = (X - μ) / σ.
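In software, standardising and then using the standard normal CDF gives the same probability as querying the original distribution directly. A sketch with `scipy.stats.norm`, using hypothetical parameters (μ = 100, σ = 15, and the query point 120 are arbitrary choices):

```python
from scipy.stats import norm

# Hypothetical example: X ~ N(mu = 100, sigma^2 = 15^2); find P(X <= 120)
mu, sigma, x = 100, 15, 120

# Standardise: Z = (X - mu) / sigma, then use the standard normal CDF
z = (x - mu) / sigma
p_std = norm.cdf(z)                          # standard normal: mean 0, variance 1

# Same answer directly from the N(mu, sigma^2) distribution
p_direct = norm.cdf(x, loc=mu, scale=sigma)

print(round(z, 4), round(p_std, 4), round(p_direct, 4))
```

The two probabilities agree because standardisation is exactly the change of variable that normal tables are built around.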

## Linear Transformation

• When a random variable X undergoes a linear transformation, e.g. Y = aX + b, the expected value transforms linearly: E[Y] = aE[X] + b.
• The variance has a scaling property: if Y = aX + b, then Var[Y] = a^2 Var[X]. The additive constant b shifts the distribution but does not change its spread.
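These two rules can be checked by simulation. A sketch using `numpy` (the choice of an exponentially distributed X with mean 2, and the constants a = 3, b = 5, are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate X with known moments: Exp with mean 2, so E[X] = 2 and Var[X] = 4
x = rng.exponential(scale=2.0, size=1_000_000)

a, b = 3.0, 5.0
y = a * x + b

# E[Y] should be close to a*E[X] + b = 11; Var[Y] close to a^2 * Var[X] = 36
print(round(y.mean(), 2), round(y.var(), 2))
```

With a million samples the estimates land close to 11 and 36; note that b appears in the mean but drops out of the variance, as the scaling property predicts.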

Remember, understanding concepts can be more beneficial than only memorising formulas!