Linear algebra is a branch of mathematics that studies vectors, vector spaces (also called linear spaces), linear transformations, and systems of linear equations. It provides a framework for understanding how to solve linear equations and perform vector operations, which are fundamental in various scientific and engineering disciplines.
Vector Spaces
A vector space \( V \) over a field \( \mathbb{F} \) (such as the real numbers \( \mathbb{R} \) or complex numbers \( \mathbb{C} \)) is a set equipped with two operations: vector addition and scalar multiplication. These operations must satisfy the following axioms:
1. Associativity of addition: For all \( \mathbf{u}, \mathbf{v}, \mathbf{w} \in V \),
$$ (\mathbf{u} + \mathbf{v}) + \mathbf{w} = \mathbf{u} + (\mathbf{v} + \mathbf{w}) $$
2. Commutativity of addition: For all \( \mathbf{u}, \mathbf{v} \in V \),
$$ \mathbf{u} + \mathbf{v} = \mathbf{v} + \mathbf{u} $$
3. Identity element of addition: There exists an element \( \mathbf{0} \in V \) such that for all \( \mathbf{u} \in V \),
$$ \mathbf{u} + \mathbf{0} = \mathbf{u} $$
4. Inverse elements of addition: For each \( \mathbf{u} \in V \), there exists an element \( -\mathbf{u} \in V \) such that
$$ \mathbf{u} + (-\mathbf{u}) = \mathbf{0} $$
5. Compatibility of scalar multiplication with field multiplication: For all \( a, b \in \mathbb{F} \) and \( \mathbf{u} \in V \),
$$ a(b\mathbf{u}) = (ab)\mathbf{u} $$
6. Identity element of scalar multiplication: For all \( \mathbf{u} \in V \),
$$ 1\mathbf{u} = \mathbf{u} $$
7. Distributivity of scalar multiplication with respect to vector addition: For all \( a \in \mathbb{F} \) and \( \mathbf{u}, \mathbf{v} \in V \),
$$ a(\mathbf{u} + \mathbf{v}) = a\mathbf{u} + a\mathbf{v} $$
8. Distributivity of scalar multiplication with respect to field addition: For all \( a, b \in \mathbb{F} \) and \( \mathbf{u} \in V \),
$$ (a + b)\mathbf{u} = a\mathbf{u} + b\mathbf{u} $$
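As an illustration, all eight axioms can be spot-checked numerically for \( \mathbb{R}^3 \) with NumPy. The vectors and scalars below are arbitrary sample values; this verifies particular instances of the axioms, not the axioms in general:

```python
import numpy as np

# Arbitrary sample vectors in R^3 and real scalars.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([7.0, 8.0, 9.0])
a, b = 2.0, -3.0
zero = np.zeros(3)

assert np.allclose((u + v) + w, u + (v + w))   # 1. associativity of addition
assert np.allclose(u + v, v + u)               # 2. commutativity of addition
assert np.allclose(u + zero, u)                # 3. additive identity
assert np.allclose(u + (-u), zero)             # 4. additive inverses
assert np.allclose(a * (b * u), (a * b) * u)   # 5. scalar compatibility
assert np.allclose(1.0 * u, u)                 # 6. scalar identity
assert np.allclose(a * (u + v), a * u + a * v) # 7. distributivity over vectors
assert np.allclose((a + b) * u, a * u + b * u) # 8. distributivity over scalars
```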
Linear Transformations
A linear transformation (or linear map) is a function \( T: V \to W \) between two vector spaces \( V \) and \( W \) that preserves the operations of vector addition and scalar multiplication. This means that for all \( \mathbf{u}, \mathbf{v} \in V \) and all scalars \( c \in \mathbb{F} \):
$$ T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v}) $$
$$ T(c\mathbf{u}) = cT(\mathbf{u}) $$
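Any matrix defines a linear map via \( T(\mathbf{x}) = A\mathbf{x} \), and the two conditions above can be checked on sample inputs (the matrix and vectors below are illustrative choices):

```python
import numpy as np

# A 2x3 matrix gives a linear map T: R^3 -> R^2.
A = np.array([[2.0, 0.0, 1.0],
              [1.0, 3.0, 0.0]])

def T(x):
    return A @ x

u = np.array([1.0, -1.0, 2.0])
v = np.array([0.5, 4.0, -3.0])
c = 2.5

assert np.allclose(T(u + v), T(u) + T(v))  # preserves vector addition
assert np.allclose(T(c * u), c * T(u))     # preserves scalar multiplication
```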
Matrices
Matrices are a convenient way to represent linear transformations. An \( m \times n \) matrix is a rectangular array of numbers with \( m \) rows and \( n \) columns. The entry in the \( i \)-th row and \( j \)-th column of a matrix \( A \) is denoted by \( a_{ij} \). If \( A \) is an \( m \times n \) matrix, and \( \mathbf{x} \) is a vector in \( \mathbb{F}^n \), the product \( A\mathbf{x} \) is a vector in \( \mathbb{F}^m \) defined by:
$$ (A\mathbf{x})_i = \sum_{j=1}^n a_{ij} x_j $$
for \( i = 1, 2, \ldots, m \).
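This entrywise definition can be implemented directly and compared against NumPy's built-in matrix-vector product (the sample matrix and vector are arbitrary):

```python
import numpy as np

def matvec(A, x):
    """Compute A x entry by entry, following (Ax)_i = sum_j a_ij x_j."""
    m, n = A.shape
    result = np.zeros(m)
    for i in range(m):
        for j in range(n):
            result[i] += A[i, j] * x[j]
    return result

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])        # 3x2 matrix: maps R^2 -> R^3
x = np.array([1.0, -1.0])

assert np.allclose(matvec(A, x), A @ x)  # agrees with NumPy's product
```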
Determinants
The determinant is a scalar value that can be computed from the elements of a square matrix and encodes certain properties of the matrix. For a \( 2 \times 2 \) matrix \( A = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \), the determinant is defined as:
$$ \det(A) = ad - bc $$
For a \( 3 \times 3 \) matrix \( A = \begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix} \), the determinant is defined as:
$$ \det(A) = aei + bfg + cdh - ceg - bdi - afh $$
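Both formulas can be coded directly and checked against `numpy.linalg.det` (the test matrices are arbitrary examples):

```python
import numpy as np

def det2(A):
    """Determinant of a 2x2 matrix: ad - bc."""
    (a, b), (c, d) = A
    return a * d - b * c

def det3(A):
    """Determinant of a 3x3 matrix, expanded as in the formula above."""
    (a, b, c), (d, e, f), (g, h, i) = A
    return a*e*i + b*f*g + c*d*h - c*e*g - b*d*i - a*f*h

M2 = np.array([[1.0, 2.0],
               [3.0, 4.0]])
M3 = np.array([[2.0, 0.0, 1.0],
               [1.0, 3.0, 0.0],
               [0.0, 1.0, 4.0]])

assert np.isclose(det2(M2), np.linalg.det(M2))
assert np.isclose(det3(M3), np.linalg.det(M3))
```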
Eigenvalues and Eigenvectors
An eigenvector of a square matrix \( A \) is a nonzero vector \( \mathbf{v} \) such that multiplication by \( A \) alters only the scale of \( \mathbf{v} \):
$$ A\mathbf{v} = \lambda\mathbf{v} $$
where \( \lambda \) is a scalar known as the eigenvalue corresponding to the eigenvector \( \mathbf{v} \). To find the eigenvalues of a matrix \( A \), we solve the characteristic equation:
$$ \det(A - \lambda I) = 0 $$
where \( I \) is the identity matrix of the same dimension as \( A \).
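Both relations can be verified numerically with `numpy.linalg.eig`, which returns the eigenvalues and the corresponding eigenvectors as matrix columns (the matrix below is an arbitrary symmetric example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each eigenpair satisfies A v = lambda v.
for k in range(len(eigenvalues)):
    lam = eigenvalues[k]
    v = eigenvectors[:, k]        # k-th eigenvector is the k-th column
    assert np.allclose(A @ v, lam * v)

# Each eigenvalue satisfies the characteristic equation det(A - lambda I) = 0.
for lam in eigenvalues:
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
```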