
Algebra, Chapter 1, Section 2

(1.2.1) Left multiplication by an $n \times n$ matrix $A$ on $n \times p$ matrices, $AX = Y$, can be computed by operating on the rows of $X$.

(1.2.2) $Y_i = a_{i1}X_1 + \dots + a_{in}X_n$, where $X_j$ denotes (row j) of $X$ and $Y_i$ denotes (row i) of $Y$.
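A minimal `numpy` sketch of this row-wise rule, with hypothetical $2 \times 2$ and $2 \times 3$ matrices:

```python
import numpy as np

# Hypothetical small matrices: A is 2x2, X is 2x3.
A = np.array([[1, 2],
              [3, 4]])
X = np.array([[5, 6, 7],
              [8, 9, 10]])

# (row i of Y) = a_i1*(row 1 of X) + ... + a_in*(row n of X), as in (1.2.2)
Y = np.array([sum(A[i, j] * X[j] for j in range(A.shape[1]))
              for i in range(A.shape[0])])
assert (Y == A @ X).all()   # agrees with ordinary matrix multiplication
```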

(1.2.3) Row operations are carried out by left multiplication by certain invertible square matrices called elementary matrices. There are three types of elementary $2 \times 2$ matrices:

- $\begin{bmatrix}1&a\\0&1\end{bmatrix}$ or $\begin{bmatrix}1&0\\a&1\end{bmatrix}$.
- $\begin{bmatrix}0&1\\1&0\end{bmatrix}$.
- $\begin{bmatrix}c&0\\0&1\end{bmatrix}$ or $\begin{bmatrix}1&0\\0&c\end{bmatrix}$, with $c \ne 0$.

(1.2.4) The analogous $5 \times 5$ elementary matrices (blank entries are zero):

- $\begin{bmatrix}1&&&&\\&1&&a&\\&&1&&\\&&&1&\\&&&&1\end{bmatrix}$ or $\begin{bmatrix}1&&&&\\&1&&&\\&&1&&\\&a&&1&\\&&&&1\end{bmatrix}$.
- $\begin{bmatrix}1&&&&\\&0&&1&\\&&1&&\\&1&&0&\\&&&&1\end{bmatrix}$.
- $\begin{bmatrix}1&&&&\\&1&&&\\&&c&&\\&&&1&\\&&&&1\end{bmatrix}$, with $c \ne 0$.

(1.2.5) The corresponding row operations on a matrix $X$:

- With $a$ in the $i, j$ position, add $a \cdot$ (row j) of $X$ to (row i).
- Interchange (row i) and (row j) of $X$.
- Multiply (row i) of $X$ by a nonzero scalar $c$.
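A short Python sketch of these matrices, for a hypothetical size $n$ and 0-indexed rows; left multiplication by each one performs the corresponding operation on the rows of $X$:

```python
import numpy as np

def add_row(n, i, j, a):
    """Elementary matrix: add a*(row j) to (row i); a sits in the i,j position."""
    E = np.eye(n)
    E[i, j] = a
    return E

def swap_rows(n, i, j):
    """Elementary matrix: interchange (row i) and (row j)."""
    E = np.eye(n)
    E[[i, j]] = E[[j, i]]
    return E

def scale_row(n, i, c):
    """Elementary matrix: multiply (row i) by a nonzero scalar c."""
    E = np.eye(n)
    E[i, i] = c
    return E

X = np.arange(12.0).reshape(4, 3)   # a hypothetical 4x3 matrix
X2 = X.copy()
X2[0] += 5.0 * X[2]                 # the row operation, done directly
assert (add_row(4, 0, 2, 5.0) @ X == X2).all()
```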

(1.2.6) Elementary matrices are invertible, and their inverses are also elementary matrices.

Proof. The inverse of an elementary matrix is the matrix corresponding to the inverse row operation: "subtract $a \cdot$ (row j) from (row i)," "interchange (row i) and (row j)" again, or "multiply (row i) by $c^{-1}$."
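For the $2 \times 2$ case, one can check each type directly:

$$\begin{bmatrix}1&a\\0&1\end{bmatrix}\begin{bmatrix}1&-a\\0&1\end{bmatrix}=I,\qquad\begin{bmatrix}0&1\\1&0\end{bmatrix}^2=I,\qquad\begin{bmatrix}c&0\\0&1\end{bmatrix}\begin{bmatrix}c^{-1}&0\\0&1\end{bmatrix}=I.$$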

(1.2.7) Row reduction: $M' = E_k \cdots E_2 E_1 M$.

(1.2.8)

$$M=\begin{bmatrix}1&1&2&1&5\\1&1&2&6&10\\1&2&5&2&7\end{bmatrix}\rightarrow\begin{bmatrix}1&1&2&1&5\\0&0&0&5&5\\0&1&3&1&2\end{bmatrix}\rightarrow\begin{bmatrix}1&1&2&1&5\\0&1&3&1&2\\0&0&0&5&5\end{bmatrix}$$

$$\rightarrow\begin{bmatrix}1&0&-1&0&3\\0&1&3&1&2\\0&0&0&5&5\end{bmatrix}\rightarrow\begin{bmatrix}1&0&-1&0&3\\0&1&3&1&2\\0&0&0&1&1\end{bmatrix}\rightarrow\begin{bmatrix}1&0&-1&0&3\\0&1&3&0&1\\0&0&0&1&1\end{bmatrix}=M'.$$
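A minimal Python sketch of this reduction (exact arithmetic via `fractions`; the function and its name are illustrative, not from the text), reproducing $M'$:

```python
from fractions import Fraction

def row_reduce(M):
    """Reduce M to the row echelon form of (1.2.12), using only the
    three elementary row operations of (1.2.5)."""
    M = [[Fraction(x) for x in row] for row in M]
    m, n = len(M), len(M[0])
    p = 0                                   # index of the next pivot row
    for col in range(n):
        r = next((i for i in range(p, m) if M[i][col] != 0), None)
        if r is None:
            continue                        # no pivot in this column
        M[p], M[r] = M[r], M[p]             # interchange rows
        c = M[p][col]
        M[p] = [x / c for x in M[p]]        # scale the pivot row by 1/c
        for i in range(m):
            if i != p and M[i][col] != 0:   # clear the rest of the column
                a = M[i][col]
                M[i] = [x - a * y for x, y in zip(M[i], M[p])]
        p += 1
    return M

M = [[1, 1, 2, 1, 5],
     [1, 1, 2, 6, 10],
     [1, 2, 5, 2, 7]]
for row in row_reduce(M):
    print([str(x) for x in row])
# rows of M':  1 0 -1 0 3  /  0 1 3 0 1  /  0 0 0 1 1
```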

(1.2.9) For a block matrix $M = [\,A \mid B\,] = \left[\begin{array}{ccc|c}a_{11}&\dots&a_{1n}&b_1\\\vdots&\ddots&\vdots&\vdots\\a_{m1}&\dots&a_{mn}&b_m\end{array}\right]$, left multiplication acts on the blocks: $EM = [\,EA \mid EB\,]$.
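A quick numerical check of the blockwise rule, reusing `add_row` from the sketch above, with hypothetical sizes:

```python
import numpy as np

E = add_row(3, 0, 2, 2.0)              # an elementary 3x3 matrix
A = np.array([[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]])
B = np.array([[1.], [0.], [2.]])
M = np.hstack([A, B])                  # the block matrix [A | B]
assert np.allclose(E @ M, np.hstack([E @ A, E @ B]))
```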

(1.2.10) The systems $A'X = B'$ and $AX = B$ have the same solutions, where $[A' \mid B']$ is obtained from $[A \mid B]$ by row reduction.

Proof. Since $M' = [A' \mid B']$ is obtained from $M = [A \mid B]$ by a sequence of elementary row operations, there are elementary matrices $E_1, \dots, E_k$ such that, with $P = E_k \cdots E_1$, $M' = E_k \cdots E_1 M = PM$.
The matrix $P$ is invertible, and $M' = [A' \mid B'] = [PA \mid PB]$. If $X$ is a solution of the original equation $AX = B$, we multiply by $P$ on the left: $PAX = PB$, which is to say, $A'X = B'$. So $X$ also solves the new equation. Conversely, if $A'X = B'$, then $P^{-1}A'X = P^{-1}B'$, that is, $AX = B$.

(1.2.11) The system of equations corresponding to the matrices $M$ and $M'$ of (1.2.8):

$$\begin{cases}x_1+x_2+2x_3+x_4=5\\x_1+x_2+2x_3+6x_4=10\\x_1+2x_2+5x_3+2x_4=7\end{cases}\ \rightarrow\ \begin{cases}x_1-x_3=3\\x_2+3x_3=1\\x_4=1\end{cases}$$

(1.2.12) A matrix $M$ is a row echelon matrix if:

(a) If (row i) of M is zero, then (row j) is zero for all j > i.

(b) If (row i) isn’t zero, its first nonzero entry is 1. This entry is called a pivot.

(c) If (row (i+1)) isn’t zero, the pivot in (row (i+1)) is to the right of the pivot in (row i).

(d) The entries above a pivot are zero. (The entries below a pivot are zero too, by (c).)

(1.2.13) Let $M' = [A' \mid B']$ be a block row echelon matrix, where $B'$ is a column vector. The system of equations $A'X = B'$ has a solution if and only if there is no pivot in the last column $B'$. In that case, arbitrary values can be assigned to the unknown $x_i$, provided that (column i) does not contain a pivot. When these arbitrary values are assigned, the other unknowns are determined uniquely.
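Applied to the reduced system in (1.2.11): the pivots of $M'$ sit in columns 1, 2, and 4, so the unknown $x_3$ may be assigned an arbitrary value $c$, and the remaining unknowns are then determined:

$$x_1 = 3 + c,\qquad x_2 = 1 - 3c,\qquad x_3 = c,\qquad x_4 = 1.$$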

(1.2.14) Every system $AX = 0$ of $m$ homogeneous equations in $n$ unknowns, with $m < n$, has a solution $X$ in which some $x_i$ is nonzero.

Proof. Row reduction of the block matrix $[A \mid 0]$ yields a matrix $[A' \mid 0]$ in which $A'$ is in row echelon form. The equation $A'X = 0$ has the same solutions as $AX = 0$. The number, say $r$, of pivots of $A'$ is at most the number $m$ of rows, so it is less than $n$. Proposition (1.2.13) tells us that we may assign arbitrary values to the $n - r$ variables $x_i$ whose columns contain no pivot; choosing a nonzero value for one of them gives a solution with some $x_i \ne 0$.
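A minimal illustration with $m = 1 < n = 2$: the single equation $x_1 + 2x_2 = 0$ is already in row echelon form, column 2 contains no pivot, and taking $x_2 = 1$ gives the nonzero solution

$$X = \begin{bmatrix}-2\\1\end{bmatrix}.$$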

(1.2.15) A square row echelon matrix $M$ is either the identity matrix $I$, or else its bottom row is zero.

Proof. Say that $M$ is an $n \times n$ row echelon matrix. Since there are $n$ columns, there are at most $n$ pivots, and if there are $n$ of them, there has to be one in each column. In this case, $M = I$. If there are fewer than $n$ pivots, then some row is zero, and the bottom row is zero too.

(1.2.16) Let $A$ be a square matrix. The following conditions are equivalent:

(a) $A$ can be reduced to the identity by a sequence of elementary row operations.

(b) $A$ is a product of elementary matrices.

(c) $A$ is invertible.

Proof. We prove the theorem by proving the implications (a) ⇒ (b) ⇒ (c) ⇒ (a). Suppose that $A$ can be reduced to the identity by row operations, say $E_k \cdots E_1 A = I$. Multiplying both sides of this equation on the left by $E_1^{-1} \cdots E_k^{-1}$, we obtain $A = E_1^{-1} \cdots E_k^{-1}$. Since the inverse of an elementary matrix is elementary, (b) holds, and therefore (a) implies (b). Because a product of invertible matrices is invertible, (b) implies (c). Finally, we prove the implication (c) ⇒ (a). If $A$ is invertible, so is the end result $A'$ of its row reduction. Since an invertible matrix cannot have a row of zeros, Lemma 1.2.15 shows that $A'$ is the identity.

(1.2.17) Let $A$ be an invertible matrix. To compute its inverse, one may apply elementary row operations $E_1, \dots, E_k$ to $A$, reducing it to the identity matrix. The same sequence of operations, applied to the identity matrix $I$, yields $A^{-1}$.

(1.2.18) We invert the matrix $A = \begin{bmatrix}1&5\\2&6\end{bmatrix}$. To do this, we form the $2 \times 4$ block matrix $\left[\begin{array}{cc|cc}1&5&1&0\\2&6&0&1\end{array}\right]$. We perform row operations to reduce $A$ to the identity, carrying the right side along, and thereby end up with $A^{-1}$ on the right.

(1.2.19)

$$[\,A \mid I\,]=\left[\begin{array}{cc|cc}1&5&1&0\\2&6&0&1\end{array}\right]\rightarrow\left[\begin{array}{cc|cc}1&5&1&0\\0&-4&-2&1\end{array}\right]\rightarrow\left[\begin{array}{cc|cc}1&5&1&0\\0&1&\frac{1}{2}&-\frac{1}{4}\end{array}\right]\rightarrow\left[\begin{array}{cc|cc}1&0&-\frac{3}{2}&\frac{5}{4}\\0&1&\frac{1}{2}&-\frac{1}{4}\end{array}\right]=[\,I \mid A^{-1}\,].$$
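The same computation with the `row_reduce` sketch from (1.2.8), applied to the block matrix $[A \mid I]$:

```python
# Reduce [A | I]; by (1.2.17) the right half of the result is A^{-1}.
for row in row_reduce([[1, 5, 1, 0],
                       [2, 6, 0, 1]]):
    print([str(x) for x in row])
# 1 0 -3/2  5/4
# 0 1  1/2 -1/4
```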

(1.2.20) Let $A$ be a square matrix that has either a left inverse or a right inverse, that is, a matrix $B$ such that either $BA = I$ or $AB = I$. Then $A$ is invertible, and $B$ is its inverse.

Proof. Suppose that $AB = I$. We perform row reduction on $A$. Say that $A' = PA$, where $P = E_k \cdots E_1$ is the product of the corresponding elementary matrices, and $A'$ is a row echelon matrix. Then $A'B = PAB = P$. Because $P$ is invertible, its bottom row isn’t zero. Then the bottom row of $A'$ can’t be zero either. Therefore $A'$ is the identity matrix (1.2.15), and so $P$ is a left inverse of $A$. Then $A$ has both a left inverse and a right inverse: $P = P(AB) = (PA)B = B$, so $A$ is invertible and $B$ is its inverse.
If $BA = I$, we interchange the roles of $A$ and $B$ in the above reasoning. We find that $B$ is invertible and that its inverse is $A$. Then $A$ is invertible, and its inverse is $B$.

(1.2.21) Square systems: The following conditions on a square matrix $A$ are equivalent:

(a) $A$ is invertible.

(b) The system of equations $AX = B$ has a unique solution for every column vector $B$.

(c) The system of homogeneous equations $AX = 0$ has only the trivial solution $X = 0$.

Proof. Given the system $AX = B$, we reduce the augmented matrix $[A \mid B]$ to row echelon form $[A' \mid B']$. The system $A'X = B'$ has the same solutions. If $A$ is invertible, then $A'$ is the identity matrix, so the unique solution is $X = B'$. This shows that (a) ⇒ (b). If an $n \times n$ matrix $A$ is not invertible, then $A'$ has a row of zeros. One of the equations making up the system $A'X = 0$ is the trivial equation $0 = 0$, so there are fewer than $n$ pivots. The homogeneous system $A'X = 0$ has a nontrivial solution (1.2.13), and so does $AX = 0$ (1.2.14). This shows that if (a) fails, then (c) also fails, hence that (c) ⇒ (a). Finally, it is obvious that (b) ⇒ (c).
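A minimal example of how (c) fails for a non-invertible matrix: for $A = \begin{bmatrix}1&2\\3&6\end{bmatrix}$, row reduction gives $A' = \begin{bmatrix}1&2\\0&0\end{bmatrix}$, and $X = (-2, 1)^t$ is a nontrivial solution of $AX = 0$.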
