Invariance
Invariance in Matrices
- The term invariance, in the context of matrices, describes a property that remains unchanged under certain transformations or operations.
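To make the idea concrete, here is a minimal sketch (assuming Python with NumPy; the matrices are arbitrary and chosen only for illustration) showing two such properties, the trace and the determinant, which stay the same when a matrix is rewritten in a different basis:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 2.0],
              [1.0, 1.0]])              # any invertible change-of-basis matrix

B = np.linalg.inv(P) @ A @ P            # the same linear map expressed in the new basis

# The individual entries of B differ from those of A, but these quantities do not:
print(np.trace(A), np.trace(B))         # both 5.0 (up to rounding)
print(np.linalg.det(A), np.linalg.det(B))  # both 6.0 (up to rounding)
```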
Linear Transformations and Invariance
- A linear transformation is a function between two vector spaces that preserves the operations of vector addition and scalar multiplication.
- A subspace is said to be invariant under a particular transformation if applying the transformation to any vector in the subspace results in a vector that still belongs to that subspace.
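As an illustration, the sketch below (a hypothetical Python/NumPy example, not drawn from the notes themselves) checks that the x-axis is an invariant subspace of an upper-triangular matrix: every vector on that line is sent to another vector on the same line.

```python
import numpy as np

# An upper-triangular matrix leaves the x-axis (the span of e1) invariant.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

v = np.array([4.0, 0.0])      # any vector on the x-axis
w = A @ v                     # its image under the transformation

print(w)                      # [8. 0.]  -> still on the x-axis
print(np.isclose(w[1], 0.0))  # True: no component off the line
```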
Types of Invariance
- Column Invariance: If a matrix undergoes a linear transformation and the column space of the transformed matrix is the same as that of the original matrix, the matrix is said to have column invariance (a numerical check of this case is sketched after this list).
- Row Invariance: Similarly, if the row space of a matrix remains the same after a linear transformation, it displays row invariance.
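One way to test column invariance numerically is to compare ranks: two matrices share a column space exactly when stacking them side by side adds no new directions. The helper below is an illustrative sketch in Python/NumPy (same_column_space is not a standard library function); multiplying on the right by an invertible matrix reshuffles the columns without leaving their span, so the test returns True.

```python
import numpy as np

def same_column_space(A, B):
    """Two matrices have the same column space iff stacking their columns
    side by side does not increase the rank."""
    r = np.linalg.matrix_rank
    return r(A) == r(B) == r(np.hstack([A, B]))

A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])
M = np.array([[1.0, 1.0],
              [0.0, 2.0]])           # invertible: amounts to column operations only

B = A @ M                            # mixes A's columns without leaving their span
print(same_column_space(A, B))       # True

C = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
print(same_column_space(A, C))       # False: C's columns span a different plane
```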
Eigenvectors & Eigenvalues
- An eigenvector of a square matrix is a non-zero vector that, when multiplied by the matrix, yields a scalar multiple of itself, i.e. Av = λv. The scalar λ is known as the eigenvalue.
- The line spanned by an eigenvector is invariant under its matrix's transformation: multiplying by the matrix only rescales the eigenvector by its eigenvalue (changing its length, and possibly its sign), never moving it off that line.
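The defining relation Av = λv can be checked directly. The sketch below (Python/NumPy, with an arbitrary symmetric matrix chosen only for illustration) verifies it for every eigenpair returned by np.linalg.eig:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):   # eigenvectors are the columns
    # A @ v points along v itself: the matrix only rescales it by lambda.
    print(np.allclose(A @ v, lam * v))            # True for each eigenpair
```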
Example of Invariance
- A good example of invariance in a matrix's contents is rotation of the array of entries. If you rotate a square matrix 180 degrees, the main diagonal maps back onto itself (with its entries in reverse order), so the set of diagonal elements, and with it the trace, does not change - the diagonal is said to be invariant under such a rotation.
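A quick numerical check of this (a sketch in Python/NumPy; np.rot90 with k=2 performs the 180-degree rotation of the entries) confirms that the diagonal elements, and hence the trace, survive the rotation:

```python
import numpy as np

A = np.arange(1, 10).reshape(3, 3)   # [[1,2,3],[4,5,6],[7,8,9]]

R = np.rot90(A, k=2)                 # rotate the entries by 180 degrees

print(np.diag(A))                    # [1 5 9]
print(np.diag(R))                    # [9 5 1]  -> same elements, reverse order
print(np.trace(A) == np.trace(R))    # True: the trace is invariant
```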