# Linear combinations of any random variables

## Linear Combinations of Random Variables

- A **linear combination** of two or more random variables is a new random variable formed by multiplying each original variable by a constant and summing the results.
- Formally, if X and Y are two random variables and `a` and `b` are constants, then `Z = aX + bY` is a linear combination of X and Y.
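The definition can be sketched concretely: each paired outcome of X and Y is scaled and summed by the same rule. The outcome values below are hypothetical, chosen only for illustration.

```python
# A minimal sketch of forming Z = aX + bY from paired outcomes of X and Y.
a, b = 2, 3
x_outcomes = [1, 4, 6, 2]   # hypothetical observed values of X
y_outcomes = [5, 3, 2, 6]   # hypothetical values of Y, paired with X

# Each realisation of Z applies the same linear rule to one pair.
z_outcomes = [a * x + b * y for x, y in zip(x_outcomes, y_outcomes)]
print(z_outcomes)  # [17, 17, 18, 22]
```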

## Calculation of Expected Value

- The **expected value E(Z)** of a linear combination `Z = aX + bY` is calculated using the formula `E(Z) = aE(X) + bE(Y)`.
- This is a direct consequence of the linearity of expectation, meaning we can add and scale expectations.
- The expected value provides the long-run average of `Z` over numerous trials, giving a measure of the “average” outcome of an experiment.
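The formula above can be checked exactly for small discrete random variables. The pmfs below are hypothetical; the joint pmf is factorised only to enumerate outcomes, since linearity of expectation itself needs no independence assumption.

```python
# Exact check of E(aX + bY) = aE(X) + bE(Y) for small discrete pmfs.
a, b = 2.0, 3.0
X = {1: 0.5, 3: 0.5}    # pmf of X: E(X) = 2.0
Y = {0: 0.25, 4: 0.75}  # pmf of Y: E(Y) = 3.0

def expectation(pmf):
    """E of a discrete random variable given as {value: probability}."""
    return sum(value * p for value, p in pmf.items())

# E(Z) computed directly over all outcome pairs.
e_z = sum((a * x + b * y) * px * py
          for x, px in X.items() for y, py in Y.items())

print(e_z)  # a*E(X) + b*E(Y) = 2*2 + 3*3 = 13.0
```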

## Calculation of Variance

- The **variance** of a linear combination `Z = aX + bY` is given by `Var(Z) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)`.
- If the random variables `X` and `Y` are independent, the covariance term `Cov(X, Y)` equals zero and the equation simplifies to `Var(Z) = a^2 Var(X) + b^2 Var(Y)`.
- The variance gives us a measure of how much the values of `Z` spread out from their expected value.
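The variance formula can be verified against a small joint pmf. The joint probabilities below are hypothetical and deliberately make X and Y dependent, so the covariance term is nonzero and actually matters.

```python
# Check Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
# on a small joint pmf {(x, y): probability} with dependent X and Y.
a, b = 1.0, 2.0
joint = {(1, 1): 0.4, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.4}

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(f(x, y) * p for (x, y), p in joint.items())

ex, ey = E(lambda x, y: x), E(lambda x, y: y)
var_x = E(lambda x, y: (x - ex) ** 2)
var_y = E(lambda x, y: (y - ey) ** 2)
cov_xy = E(lambda x, y: (x - ex) * (y - ey))   # 0.15 here, not zero

# Variance of Z = aX + bY computed directly from the joint pmf ...
ez = E(lambda x, y: a * x + b * y)
var_z = E(lambda x, y: (a * x + b * y - ez) ** 2)

# ... matches the formula, covariance term included.
formula = a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy
print(var_z)  # 1.85
```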

## Application of Linear Combinations

- A **linear combination** of random variables is a fundamental concept in statistics and forms the mathematical basis for numerous statistical procedures, including hypothesis testing, regression analysis, and factor analysis.
- It helps us link together different random variables to create new variables, which can then be explored for further relationships and interdependencies.
- The properties of these combinations, such as their expected values and variances, can shed light on the overall behaviour of a complex system.

## Key Concept: Independence of Random Variables

- Independence is an important idea when dealing with linear combinations of random variables.
- Two variables are **independent** if knowing the outcome of one variable does not affect the probability of outcomes for the other variable.
- If the two variables are independent, the distribution of their sum or difference can be determined by convolving their individual distributions.
- Independence simplifies many statistical computations and is a common assumption in many types of statistical procedures.
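The convolution idea can be sketched for discrete variables: under independence, each probability of the sum is a product of the individual probabilities, accumulated over all pairs. Two fair six-sided dice are used here as a chosen example.

```python
from collections import defaultdict

# Sketch: for independent X and Y, the pmf of X + Y is the convolution
# of their individual pmfs. Both variables here are fair six-sided dice.
die = {face: 1 / 6 for face in range(1, 7)}

def convolve(pmf_x, pmf_y):
    """Return the pmf of X + Y, assuming X and Y are independent."""
    out = defaultdict(float)
    for x, px in pmf_x.items():
        for y, py in pmf_y.items():
            out[x + y] += px * py   # P(X=x) * P(Y=y) by independence
    return dict(out)

total = convolve(die, die)
print(round(total[7], 4))  # 7 is the most likely sum: 6/36 ≈ 0.1667
```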