3.6 Invertible Matrix Theorem
Invertible Matrix Theorem (IMT). Let \(A\) be an \(n\times n\) matrix, and let \(T:\mathbb R^n\rightarrow \mathbb R^n\) be the matrix transformation \(T(\vec x)=A\vec x\). The following statements are equivalent:
\(A\) is invertible.
\(A\) has \(n\) pivots.
\(\text{Null}(A)=\{\vec 0\}\).
The columns of \(A\) are linearly independent.
The columns of \(A\) span \(\mathbb R^n\).
\(A\vec x=\vec b\) has a unique solution for each \(\vec b\) in \(\mathbb R^n\).
The linear transformation \(T\) is invertible, specifically:
\(T\) is one-to-one.
\(T\) is onto.
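The equivalence also works in the negative: if any one condition fails, they all fail. As a quick illustration (the matrix below is our own, not from the examples), take a \(2\times 2\) matrix whose second column is a multiple of the first:

```matlab
% Hypothetical example: a singular 2x2 matrix fails every IMT condition at once.
M = [1 2; 2 4];    % second column = 2 * (first column)
rank(M)            % 1, not n = 2: fewer than n pivots
det(M)             % 0: M is not invertible
rref(M)            % [1 2; 0 0]: columns are linearly dependent
```

One failed condition (dependent columns) immediately rules out all the others, including invertibility of the transformation \(T(\vec x)=M\vec x\).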
Using the IMT
We use the IMT all the time. Is a set of \(n\) vectors a basis for \(\mathbb R^n\)? Combine them into a matrix and check whether the matrix is invertible. Is a linear transformation one-to-one? Verify that its null space contains only \(\vec 0\). Does \(A\) have an inverse? Row reduce and count the pivots.
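Each of these checks is one line in MATLAB. A sketch, using a small matrix of our own choosing for illustration:

```matlab
% Two equivalent invertibility checks for a square matrix M.
M = [2 1; 5 3];              % any n-by-n matrix
n = size(M, 1);

rank(M) == n                 % n pivots?
isequal(rref(M), eye(n))     % row reduces to the identity?
% If either check holds, M is invertible, its columns form a basis
% for R^n, and T(x) = M*x is one-to-one and onto.
```

By the IMT, passing any one of these checks settles all the others at once.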
Examples
Example 1
Determine whether the linear transformation \(T\) is one-to-one. Is it onto? The standard matrix associated with \(T\) is
A = [0 -2 -3 9 ; 0 0 0 1 ; -2 2 5 -15 ; 1 -1 -2 7 ];
rref(A)
ans =
1 0 0 0
0 1 0 0
0 0 1 0
0 0 0 1
With 4 pivots, the dimension of the null space is zero, so the transformation is both one-to-one and onto (invertible).
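Since \(A\) is invertible, \(A\vec x=\vec b\) has a unique solution for every \(\vec b\). For instance (the right-hand side below is an arbitrary choice of ours):

```matlab
% With A invertible, every right-hand side b has exactly one solution.
A = [0 -2 -3 9; 0 0 0 1; -2 2 5 -15; 1 -1 -2 7];
b = [1; 0; 0; 0];    % an arbitrary b, chosen for illustration
x = A \ b;           % the unique solution of A*x = b
norm(A*x - b)        % essentially zero, confirming the solution
```

The same backslash solve would fail to give a unique answer if \(A\) had fewer than 4 pivots.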
Example 2
Is the set of vectors a basis for \(\mathbb R^5\)?
We create the matrix \(B = [\vec b_1,\vec b_2, \vec b_3 , \vec b_4 , \vec b_5 ]\) and check whether it is invertible.
B = [1 2 -2 2 2 ; -2 0 -1 1 0 ; 2 5 3 2 3 ; -1 4 3 4 1 ; 3 5 4 3 2 ];
inv(B)
ans =
1.0000 -4.0000 0.0000 2.0000 -2.0000
-25.0000 110.0000 -7.0000 -61.0000 66.0000
10.0000 -45.0000 3.0000 25.0000 -27.0000
12.0000 -52.0000 3.0000 29.0000 -31.0000
23.0000 -101.0000 7.0000 56.0000 -61.0000
Since \(B\) is invertible, we know the columns are linearly independent and thus form a basis for \(\mathbb R^5\).
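Row reduction gives the same conclusion: because \(B\) is invertible, its reduced row echelon form must be the identity, with a pivot in every column.

```matlab
% Equivalent check: B is invertible iff rref(B) is the 5x5 identity.
B = [1 2 -2 2 2; -2 0 -1 1 0; 2 5 3 2 3; -1 4 3 4 1; 3 5 4 3 2];
rref(B)              % returns eye(5): five pivots, columns independent
```

Either computation, the inverse or the row reduction, invokes the IMT to settle the basis question.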