Gaussian Elimination

Algorithm which takes a matrix $A \in F^{m \times n}$ and applies a sequence of invertible elementary row operations (row reordering, scaling, and replacement), with composite matrix $E \in F^{m \times m}$, to $A$, producing a row echelon form of $A$. Furthermore, the algorithm can be extended to produce its reduced row echelon form.

Gaussian Elimination (**A**):
		for row i = 1 to m:
				1. find the leftmost column of **A** with a nonzero entry in rows i through m          ; **pivot column**
				2. apply row operations until the i-th entry of that column is nonzero                 ; **pivot**
				3. eliminate all nonzero entries below the pivot using row operations
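
A rough sketch of this loop in Python (the function name and the use of exact `Fraction` arithmetic are my own choices, not part of the notes):

```python
from fractions import Fraction

def row_echelon(A):
    """Reduce a list-of-lists matrix to a row echelon form (illustrative sketch)."""
    A = [[Fraction(x) for x in row] for row in A]      # exact arithmetic, no float round-off
    m, n = len(A), len(A[0])
    pivot_row = 0
    for col in range(n):                                # 1. scan columns left to right
        # find a row at or below pivot_row with a nonzero entry in this column
        r = next((r for r in range(pivot_row, m) if A[r][col] != 0), None)
        if r is None:
            continue                                    # no pivot in this column
        A[pivot_row], A[r] = A[r], A[pivot_row]         # 2. swap it up to the pivot position
        for below in range(pivot_row + 1, m):           # 3. eliminate entries below the pivot
            factor = A[below][col] / A[pivot_row][col]
            A[below] = [a - factor * b for a, b in zip(A[below], A[pivot_row])]
        pivot_row += 1
        if pivot_row == m:
            break
    return A

print(row_echelon([[1, 2, 1], [2, 4, 0], [3, 6, 1]]))   # rows (1 2 1), (0 0 -2), (0 0 0)
```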

<aside> <img src="/icons/bookmark_gray.svg" alt="/icons/bookmark_gray.svg" width="40px" />

Definition. A row echelon form of $A \in F^{m \times n}$ is a row-equivalent matrix $B$, that is to say $B = EA$ for some sequence of row operations $E$, s.t.

  1. All zero rows of $B$ are below all non-zero rows.
  2. For any non-zero row, its pivot (leftmost nonzero entry) is to the right of the pivots of all preceding rows.

Furthermore, if a row equivalent matrix $C$ satisfies the above along with the requirements that each pivot is $1$ and all entries above each pivot are zero, then $C$ is called the reduced row echelon form of $A$.

</aside>
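
For instance, for the (arbitrarily chosen) matrix below, the middle matrix is a row echelon form of $A$ and the right-most is its reduced row echelon form, with pivots in columns $1$ and $3$:

$$ A = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 4 & 0 \\ 3 & 6 & 1 \end{pmatrix} \;\sim\; \begin{pmatrix} 1 & 2 & 1 \\ 0 & 0 & -2 \\ 0 & 0 & 0 \end{pmatrix} \;\sim\; \begin{pmatrix} 1 & 2 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix} $$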

The main motivation for computing these forms comes from augmented matrices, which concatenate two matrices $A$ and $B$ into $(A \mid B)$. Row-reducing $(A \mid y)$ solves the system $Ax = y$, and row-reducing $(A \mid I)$ yields the inverse of $A$ when it exists: the reduced row echelon form of an invertible matrix is simply the identity, so the composite $E$ of the row operations that reduce $A$ to $I$ is itself $A^{-1}$.

$$ EA = I \implies A^{-1} = (EA)A^{-1} = E(AA^{-1}) = E $$
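
A minimal sketch of this $(A \mid I)$ procedure (Gauss-Jordan elimination), again assuming exact `Fraction` arithmetic; the function name is my own:

```python
from fractions import Fraction

def inverse_via_rref(A):
    """Row-reduce (A | I) to (I | A^{-1}); returns None if A is singular (illustrative sketch)."""
    n = len(A)
    # build the augmented matrix (A | I) with exact rational entries
    M = [[Fraction(A[i][j]) for j in range(n)] + [Fraction(int(i == j)) for j in range(n)]
         for i in range(n)]
    for col in range(n):
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            return None                                  # no pivot in this column: A is not invertible
        M[col], M[pivot] = M[pivot], M[col]
        M[col] = [x / M[col][col] for x in M[col]]       # scale so the pivot equals 1
        for r in range(n):
            if r != col and M[r][col] != 0:              # clear the rest of the pivot column
                M[r] = [a - M[r][col] * b for a, b in zip(M[r], M[col])]
    return [row[n:] for row in M]                        # the right half is now A^{-1}

print(inverse_via_rref([[2, 1], [1, 1]]))                # A^{-1} has rows (1 -1), (-1 2)
```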

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />

Theorem. For $A \in F^{m \times n}$, $x \in F^n$, $y \in F^{m}$ and any row echelon form $B$ of $A$,

  1. Uniqueness of Solutions. A solution $\hat{x}$ to $Ax = y$ is unique iff $B$ has a pivot in each column.
  2. Consistency. The system $Ax = y$ is consistent (i.e. $y \in \text{Col}(A)$) iff $B$ has a pivot in each row.
  3. Uniqueness of All Solutions. For all $y \in F^m$, the system $Ax = y$ has a unique solution iff $B$ has a pivot in each column and each row.
  4. Inconsistency. For $(A|y) \sim (B|z)$, the system $Ax = y$ is inconsistent iff $(B|z)$ has a pivot in the last column. </aside>
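
For the example matrix above, any row echelon form has pivots in columns $1$ and $3$ only, so a solution to $Ax = y$ is never unique, and since row $3$ has no pivot, the system is inconsistent for some choices of $y$.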

<aside> <img src="/icons/castle_yellow.svg" alt="/icons/castle_yellow.svg" width="40px" />

Theorem. For $A \in F^{m \times n}$ s.t. $A = ( w_1 \dots w_n)$ and any row echelon form $B$ of $A$,

  1. Column Independence. $w_1, \dots, w_n$ is linearly independent iff $B$ has a pivot in every column.
  2. Column Spanning. $w_1, \dots, w_n$ spans $F^m$ iff $B$ has a pivot in every row. </aside>
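
Continuing the same example, column $2$ of any row echelon form has no pivot, so the columns are linearly dependent (indeed $w_2 = 2w_1$), and row $3$ has no pivot, so they do not span $F^3$.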

Change of Basis

For a finite-dimensional vector space $\mathbf{V}$ with bases $\beta$ and $\gamma$, a change of basis from $\beta$ to $\gamma$ with respect to an operator $T \in \mathscr{L}(\mathbf{V})$ is a change in the coordinates in which its associated matrix is expressed.

$$ [T]_\gamma = [I]_\beta^\gamma \, [T]_\beta \, [I]_\gamma^\beta $$

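A small numerical sketch of the formula (the operator and the basis $\gamma$ are an invented example, chosen so that $\gamma$ is an eigenbasis of $T$):

```python
import numpy as np

# V = R^2: beta is the standard basis, gamma = {(1, 1), (1, -1)} (an invented example)
T_beta = np.array([[2.0, 1.0],
                   [1.0, 2.0]])                 # [T]_beta

I_gamma_beta = np.array([[1.0,  1.0],
                         [1.0, -1.0]])          # [I]_gamma^beta: columns are the gamma vectors in beta coordinates
I_beta_gamma = np.linalg.inv(I_gamma_beta)      # [I]_beta^gamma = ([I]_gamma^beta)^{-1}

T_gamma = I_beta_gamma @ T_beta @ I_gamma_beta  # [T]_gamma = [I]_beta^gamma [T]_beta [I]_gamma^beta
print(T_gamma)                                  # diag(3, 1): gamma is an eigenbasis of T
```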

<aside> <img src="/icons/chess-pawn_brown.svg" alt="/icons/chess-pawn_brown.svg" width="40px" />

Proposition. The inverse of the matrix of the identity from $\beta$ to $\gamma$ is the matrix of the identity from $\gamma$ to $\beta$.

$$ ([I]_\beta^\gamma)^{-1} = [I]_\gamma^\beta $$

</aside>

Dual Spaces

For a finite-dimensional vector space $\mathbf{V}$ over a field $F$, its dual space $\mathbf{V}'$ is the space of all linear functionals on $\mathbf{V}$, i.e. the linear maps from $\mathbf{V}$ to its underlying field $F$.
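
For example, on $F^n$ the coordinate maps $f_i(x_1, \dots, x_n) = x_i$ are linear functionals, as is $x \mapsto a_1 x_1 + \dots + a_n x_n$ for any fixed $a_1, \dots, a_n \in F$; all of these belong to $(F^n)'$.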