Computing the determinant

Recall the defining properties of the determinant function from the previous lecture. We will soon show that the function satisfying these properties is unique. Assuming this to be the case, how would we actually compute the determinant of a given matrix?

Theorem 1 (Lemma 1).

If $A$ has a zero row or column, then $\det(A) = 0$.

Proof
Let the $i$th row of $A$ be a zero row.
Let the row operation $R_i \to 2R_i$ give the matrix $B$. Since row $i$ is zero, $B = A$, so $\det(A) = \det(B) = 2\det(A)$, which forces $\det(A) = 0$. If instead $A$ has a zero column, its columns are linearly dependent, hence so are its rows, and a zero row can be produced using row operations. ❏

Theorem 2 (Lemma 2).

If $A$ is upper or lower triangular, then $\det(A)$ is the product of its diagonal entries.

Proof:
Let $A$ be an $n \times n$ upper triangular matrix with diagonal entries $a_{11}, \ldots, a_{nn}$. (The lower triangular case is analogous.)

Case 1: One of the diagonal entries of $A$ is zero.
Pick the lowest zero on the diagonal, say $a_{ii}$. All the entries to the left of $a_{ii}$ on row $i$ are $0$, since $A$ is upper triangular. All the entries to the right of $a_{ii}$ on row $i$ can be made $0$ using row operations, since $a_{jj} \neq 0$ for all $j > i$: subtract a suitable multiple of row $j$ to clear the entry in column $j$. Thus, row $i$ can be made a zero row using row operations. From Lemma 1 and the property that these row operations do not change the determinant, it follows that $\det(A) = 0$, which is indeed the product of the diagonal entries.

Case 2: None of the diagonal entries of $A$ are zero.
Scale each row $i$ by $1/a_{ii}$. This gives us an upper triangular matrix with all the diagonal entries equal to $1$, and multiplies the determinant by $1/(a_{11}a_{22}\cdots a_{nn})$. This matrix can be reduced to $I$ using row operations, and $\det(I) = 1$. So, we have
$\det(A) = a_{11}a_{22}\cdots a_{nn}$. ❏

Theorem 3 (Proposition).

We may compute the determinant of a matrix by using row operations. Let $R$ be a row reduced (echelon) form of $A$, obtained using only row swaps and additions of multiples of one row to another. Then,
$\det(A) = (-1)^s \det(R),$
where $s$ is the number of row swaps used in getting from $A$ to $R$. Since $R$ is upper triangular, $\det(R)$ is the product of its diagonal entries by Lemma 2.

This follows immediately from the preceding lemmas and the properties of the determinant.
This also shows that $\det(A) \neq 0$ if and only if $A$ is invertible.
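The proposition translates directly into an algorithm. Here is a minimal Python sketch (not from the lecture; the function name and the use of exact rational arithmetic are my own choices): reduce to upper triangular form using only row swaps and row additions, then multiply the diagonal entries (Lemma 2) and attach the sign $(-1)^s$.

```python
from fractions import Fraction

def det_by_row_reduction(rows):
    """Determinant via row reduction to upper triangular form.

    Only row swaps (each flips the sign) and adding multiples of one
    row to another (no change) are used, so the determinant is
    (-1)^s times the product of the resulting diagonal (Lemma 2).
    """
    a = [[Fraction(x) for x in row] for row in rows]
    n = len(a)
    sign = 1
    for col in range(n):
        # Find a nonzero pivot at or below the diagonal.
        pivot = next((r for r in range(col, n) if a[r][col] != 0), None)
        if pivot is None:
            return Fraction(0)  # a diagonal entry would be zero
        if pivot != col:
            a[col], a[pivot] = a[pivot], a[col]
            sign = -sign  # a row swap flips the sign
        for r in range(col + 1, n):
            factor = a[r][col] / a[col][col]
            a[r] = [x - factor * y for x, y in zip(a[r], a[col])]
    det = Fraction(sign)
    for i in range(n):
        det *= a[i][i]
    return det
```

Exact `Fraction` arithmetic avoids the floating-point roundoff that plagues naive elimination; for example, `det_by_row_reduction([[0, 1], [1, 0]])` performs one swap and returns $-1$.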


Another characterization of the determinant

Multilinearity

Definition 4.

Let $V$ and $W$ be vector spaces. A function $f : V^k \to W$ is called multilinear if it is linear in each of its arguments when the rest of the arguments are kept constant.

For example, the dot product $f(x, y) = x \cdot y$ on $\mathbb{R}^n$ is multilinear with $k = 2$, or bilinear.

The determinant can be thought of as a function of the row vectors of a matrix:

Theorem 5(Lemma).

Let $v_1, \ldots, v_n$ be the row vectors of an $n \times n$ matrix, and suppose $v_i = x + y$. Then,
$\det(v_1, \ldots, x + y, \ldots, v_n) = \det(v_1, \ldots, x, \ldots, v_n) + \det(v_1, \ldots, y, \ldots, v_n)$, where only row $i$ changes between the three matrices.

Proof
If $\{v_j : j \neq i\}$ is linearly dependent, a zero row can be obtained using row operations in all three matrices, and thus all three determinants are zero. Thus, assume $\{v_j : j \neq i\}$ is linearly independent.

Let $\{x\} \cup \{v_j : j \neq i\}$ be linearly dependent. Then, $x$ can be expressed as a linear combination of $\{v_j : j \neq i\}$. Thus, $\det(v_1, \ldots, x, \ldots, v_n) = 0$, and since $x$ can be gotten rid of using row operations, $\det(v_1, \ldots, x + y, \ldots, v_n) = \det(v_1, \ldots, y, \ldots, v_n)$, so the claimed equality holds.

Let $\{x\} \cup \{v_j : j \neq i\}$ be linearly independent. If $\{y\} \cup \{v_j : j \neq i\}$ or $\{x + y\} \cup \{v_j : j \neq i\}$ is linearly dependent, the lemma follows as in the previous case. Thus, assume both sets are linearly independent. Also, $\{x, y\} \cup \{v_j : j \neq i\}$ has to be linearly dependent, since $\mathbb{R}^n$ cannot have a linearly independent set of more than $n$ vectors. So, $y = cx + v$ for some scalar $c$ and some linear combination $v$ of $\{v_j : j \neq i\}$. Removing $v$ with row operations and using the scaling property, both the left and right hand sides simplify to $(1 + c)\det(v_1, \ldots, x, \ldots, v_n)$, and we are done. ❏

We will now show that this makes the determinant multilinear. WLOG, consider linearity in the first row, and fix row vectors $v_2, \ldots, v_n$. Let $T : \mathbb{R}^n \to \mathbb{R}$ be defined by $T(x) = \det(x, v_2, \ldots, v_n)$. Then,
$T(x + y) = T(x) + T(y)$ by the lemma, and $T(cx) = cT(x)$ by the row scaling property, so $T$ is linear.

New defining properties

Thus, we arrive at these alternate defining properties of the determinant:

  1. $\det$ is multilinear in the rows of $A$.
  2. If $A$ has two identical rows, then its determinant is zero.
  3. The determinant of the identity matrix is one.

Exercise: Show that these are equivalent to the previous defining properties.
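As a quick sanity check, the three properties can be verified numerically for $2 \times 2$ matrices using the explicit formula $ad - bc$. This is a sketch; the helper `det2` and the sample rows are my own, not from the lecture.

```python
def det2(m):
    """Determinant of a 2x2 matrix via the explicit formula ad - bc."""
    (a, b), (c, d) = m
    return a * d - b * c

x, y, v = [1, 2], [3, 5], [7, 11]  # arbitrary sample rows

# 1. Multilinearity in the first row: additivity and scaling.
assert det2([[x[0] + y[0], x[1] + y[1]], v]) == det2([x, v]) + det2([y, v])
assert det2([[4 * x[0], 4 * x[1]], v]) == 4 * det2([x, v])

# 2. Two identical rows give determinant zero.
assert det2([v, v]) == 0

# 3. The identity matrix has determinant one.
assert det2([[1, 0], [0, 1]]) == 1
```

Of course, a finite check is not a proof; it only illustrates what the three properties assert.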


Cofactor expansions

Definition 6.

Let $A$ be an $n \times n$ matrix.
The $(i, j)$th minor, denoted by $M_{ij}$, is the $(n - 1) \times (n - 1)$ matrix obtained by deleting the $i$th row and the $j$th column from $A$.
The $(i, j)$th cofactor is $C_{ij} = (-1)^{i+j} \det(M_{ij})$.

Now, define $\det([a]) = a$ for $1 \times 1$ matrices. Note that it satisfies the defining properties.
Continue to define the determinant of $n \times n$ matrices inductively by the recursive formula
$\det(A) = \sum_{j=1}^{n} a_{ij} C_{ij} = \sum_{j=1}^{n} (-1)^{i+j} a_{ij} \det(M_{ij}).$
The above formula is called the cofactor expansion along the $i$th row.
Exercise: Show that this formula satisfies the defining properties.
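The recursive formula can be implemented directly. Below is a minimal Python sketch (the function name and 0-indexing are my own choices); the `row` argument lets one expand along any row, anticipating the independence result proved next.

```python
def det_cofactor(a, row=0):
    """Determinant by cofactor expansion along the given row (0-indexed)."""
    n = len(a)
    if n == 1:
        return a[0][0]  # base case: det([a]) = a
    total = 0
    for j in range(n):
        # The (row, j) minor: delete `row` and column j.
        minor = [r[:j] + r[j + 1:] for k, r in enumerate(a) if k != row]
        # The (row, j) cofactor carries the sign (-1)^(row + j).
        total += (-1) ** (row + j) * a[row][j] * det_cofactor(minor)
    return total
```

For instance, `det_cofactor(m, 0)` and `det_cofactor(m, 1)` agree for any square `m`. Note the exponential running time: each call spawns $n$ calls on $(n-1) \times (n-1)$ minors, which is why row reduction is preferred in practice.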

Since we are assuming that the function satisfying the defining properties of the determinant is unique, $\det(A)$ can have only one value. Thus, it follows that the cofactor expansion along any row gives the same result. However, here’s how you prove this without that assumption:

Proof
We can show that the function “cofactor expansion along row 1” satisfies the defining properties of the determinant. Thus, $\det(A) =$ the cofactor expansion of $A$ along row 1. Consider row $i$. Using $i - 1$ adjacent row swaps, bring this row to the top (leaving the order of the other rows unchanged), and call this new matrix $B$. Note that $\det(A) = (-1)^{i-1}\det(B)$. Also note that $b_{1j} = a_{ij}$ and that the $(1, j)$ minor of $B$ equals the $(i, j)$ minor of $A$. Now, $\det(B) =$ the cofactor expansion along row 1 of $B$, i.e.,
$\det(A) = (-1)^{i-1} \sum_{j=1}^{n} (-1)^{1+j} b_{1j} \det(M^B_{1j}) = \sum_{j=1}^{n} (-1)^{i+j} a_{ij} \det(M^A_{ij}),$
which is the cofactor expansion of $A$ along row $i$. ❏