Section 3.3. Matrix Rank and the Inverse of a Full Rank Matrix


Note. This lengthy section (21 pages in the text) gives a thorough study of the rank of a matrix (and matrix products) and considers inverses of matrices briefly at the end.

Note. Recall that the row space of a matrix A is the span of the row vectors of A and the row rank of A is the dimension of this row space. Similarly, the column space of A is the span of the column vectors of A and the column rank is the dimension of this column space. You will recall that the dimension of the column space and the dimension of the row space of a given matrix are the same (see the result on the rank of a matrix in Fraleigh and Beauregard's Linear Algebra, 3rd Edition, Addison-Wesley Publishing Company, 1995). We now give a proof of this based in part on Gentle's argument and on Peter Lancaster's Theory of Matrices, Academic Press (1969), page 42. First, we need a lemma.

Lemma. Let {a_i}_{i=1}^k = {[a_{i1}, a_{i2}, ..., a_{in}]}_{i=1}^k be a set of vectors in R^n and let π ∈ S_n be a permutation of {1, 2, ..., n}. Then the set of vectors {a_i}_{i=1}^k is linearly independent if and only if the set of vectors {[a_{iπ(1)}, a_{iπ(2)}, ..., a_{iπ(n)}]}_{i=1}^k is linearly independent.
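
The lemma has a quick numerical illustration. The sketch below is my own (the example vectors and the use of NumPy's matrix_rank are assumptions, not part of Gentle's text): applying the same permutation to the entries of every vector only permutes the columns of the matrix whose rows are those vectors, so linear independence, and more generally rank, is unchanged.

    import numpy as np

    # Rows of A are the vectors a_1, a_2; both orderings of the entries
    # give the same rank, so independence is preserved.
    A = np.array([[1., 2., 3.],
                  [0., 1., 1.]])               # two linearly independent vectors
    perm = [2, 0, 1]                           # the same permutation applied to every vector
    print(np.linalg.matrix_rank(A))            # 2
    print(np.linalg.matrix_rank(A[:, perm]))   # 2, still linearly independent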

That is, permuting all of the entries in a set of vectors by the same permutation preserves the linear dependence/independence of the set.

Theorem. Let A be an n×m matrix. Then the row rank of A equals the column rank of A. This common quantity is called the rank of A.

Note. Recall that V(A) denotes the column space of matrix A (see page 41 of the text) and so V(A^T) is the row space of A. So from the definition of rank and the theorem above, we can conclude that rank(A) = dim(V(A)), rank(A) = rank(A^T), and dim(V(A)) = dim(V(A^T)).

Definition. A matrix is of full rank if its rank is the same as its smaller dimension. A matrix that is not full rank is rank deficient, and the rank deficiency is the difference between its smaller dimension and the rank. A full rank matrix which is square is nonsingular. A square matrix which is not nonsingular is singular.

Note. The previous definition of singular/nonsingular may be new to you. Later we will relate nonsingularity and invertibility, as you expect (see the note below).
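
As a numerical companion to the theorem and definitions above (the example matrix is my own, not Gentle's), the following checks that the row rank and column rank agree and computes the rank deficiency of a 3×4 matrix of rank 2.

    import numpy as np

    A = np.array([[1., 0., 1., 2.],
                  [0., 1., 1., 3.],
                  [1., 1., 2., 5.]])                # row 3 = row 1 + row 2
    print(np.linalg.matrix_rank(A))                 # 2 (column rank)
    print(np.linalg.matrix_rank(A.T))               # 2 (row rank)
    print(min(A.shape) - np.linalg.matrix_rank(A))  # rank deficiency: 3 - 2 = 1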

Theorem. If P and Q are products of elementary matrices, then rank(PAQ) = rank(A).

Note. Since the rank of I_n is n, the theorem above applied with A = I_n implies that each elementary matrix is of full rank.

Note. Recall that an earlier exercise states: Let A be a set of n-vectors.
(i) If B ⊆ A then dim(span(B)) ≤ dim(span(A)).
(ii) If A = A_1 ∪ A_2 then dim(span(A)) ≤ dim(span(A_1)) + dim(span(A_2)).
(iii) If A = A_1 ⊕ A_2 (so that A_1 ⊥ A_2) for vector spaces A, A_1, and A_2, then dim(A) = dim(A_1) + dim(A_2).
These observations are useful in proving the next theorem.

Theorem. Let A be a matrix partitioned as A = [A_11 | A_12; A_21 | A_22], where [X | Y] denotes blocks placed side by side and [X; Y] denotes block X stacked above block Y. Then
(i) rank(A_ij) ≤ rank(A) for i, j ∈ {1, 2}.
(ii) rank(A) ≤ rank([A_11 | A_12]) + rank([A_21 | A_22]).
(iii) rank(A) ≤ rank([A_11; A_21]) + rank([A_12; A_22]).
(iv) If V([A_11 | A_12]^T) ⊥ V([A_21 | A_22]^T) then rank(A) = rank([A_11 | A_12]) + rank([A_21 | A_22]), and if V([A_11; A_21]) ⊥ V([A_12; A_22]) then rank(A) = rank([A_11; A_21]) + rank([A_12; A_22]).

Note. In an exercise, it is to be shown using part (iv) of the theorem above that for a block diagonal matrix A = diag(A_11, A_22, ..., A_kk), we have rank(A) = rank(A_11) + rank(A_22) + ··· + rank(A_kk).
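
The block diagonal rank formula in the note above is easy to confirm numerically; the example blocks below are assumptions of mine, and np.block is used only to assemble the block diagonal matrix.

    import numpy as np

    A11 = np.array([[1., 2.],
                    [2., 4.]])                    # rank 1
    A22 = np.array([[1., 0.],
                    [0., 1.]])                    # rank 2
    A = np.block([[A11, np.zeros((2, 2))],
                  [np.zeros((2, 2)), A22]])       # diag(A11, A22)
    print(np.linalg.matrix_rank(A11) + np.linalg.matrix_rank(A22))   # 3
    print(np.linalg.matrix_rank(A))                                  # 3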

Theorem. Let A be an n×k matrix and B be a k×m matrix. Then rank(AB) ≤ min{rank(A), rank(B)}.

Note. By this theorem, for x ∈ R^n and y ∈ R^m, the outer product xy^T satisfies rank(xy^T) ≤ min{rank(x), rank(y^T)} = 1.

Theorem. Let A and B be n×m matrices. Then |rank(A) − rank(B)| ≤ rank(A + B) ≤ rank(A) + rank(B).

Note. If an n×m matrix A is of rank r, then it has r linearly independent rows. So there is a permutation matrix E_1 such that E_1 A is a matrix whose first r rows are linearly independent (and certainly the choice of E_1 is not unique). Since E_1 A is of rank r (by the theorem on rank(PAQ) above), it has r linearly independent columns (by the theorem equating row rank and column rank) and there is a permutation matrix E_2 such that E_1 A E_2 is a matrix whose first r columns are linearly independent. The matrix B = E_1 A E_2 can then be partitioned in a way that isolates linearly independent sub-rows and sub-columns.

Theorem. Let B be an n×m matrix of rank r whose first r rows are linearly independent and whose first r columns are linearly independent. Then the partitioning of B as B = [W | X; Y | Z], where W is an r×r full rank submatrix, X is r×(m−r), Y is (n−r)×r, and Z is (n−r)×(m−r), is a full rank partitioning of B.

Note. If an n×m matrix A is of rank r, then for any q ≤ r (with E_1 and E_2 as described in the previous note) we have E_1 A E_2 = [S | T; U | V], where S is a q×q full rank matrix, T is q×(m−q), U is (n−q)×q, and V is (n−q)×(m−q).
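
A short numerical check of the product-rank bound and the outer product remark above (the matrices are randomly generated examples of my own, seeded for reproducibility):

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((5, 3))               # generically rank 3
    B = rng.standard_normal((3, 4))               # generically rank 3
    # rank(AB) <= min{rank(A), rank(B)} = 3
    print(np.linalg.matrix_rank(A @ B))           # 3

    x = np.array([1., 2., 3.])
    y = np.array([4., 5.])
    print(np.linalg.matrix_rank(np.outer(x, y)))  # 1, as for any nonzero outer product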

Definition. A system of n linear equations in m unknowns x_1, x_2, ..., x_m is a system of the form
a_11 x_1 + a_12 x_2 + ··· + a_1m x_m = b_1
a_21 x_1 + a_22 x_2 + ··· + a_2m x_m = b_2
  ⋮
a_n1 x_1 + a_n2 x_2 + ··· + a_nm x_m = b_n.
With A = [a_ij], b ∈ R^n with entries b_i, and x ∈ R^m with entries x_j (so that b and x are column vectors by our convention), the system can be written as Ax = b. The matrix A is the coefficient matrix. For a given A and b, a vector x ∈ R^m for which Ax = b is a solution to the system of equations. If a solution exists, the system of equations is consistent. If a solution does not exist, the system of equations is inconsistent.

Note. In sophomore linear algebra, you used elementary row operations to explore solutions to systems of equations. Here, we use the rank of the coefficient matrix and the existence of an inverse of the coefficient matrix to explore solutions to systems of equations.

Note. Recall from an earlier section (see page 5 of the class notes) that with a_1, a_2, ..., a_m as the columns of A and x ∈ R^m a vector of scalars x_1, x_2, ..., x_m, we have Ax = Σ_{i=1}^m x_i a_i. So for any given x ∈ R^m, the vector Ax is a linear combination of the columns of A, and so having a solution x to the system Ax = b is equivalent to saying that b is in the column space V(A).

Theorem. The system Ax = b is consistent if and only if rank([A|b]) = rank(A).
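
The consistency criterion above translates directly into a rank computation. The sketch below is my own illustration (the helper name consistent and the example matrices are assumptions): appending b to A leaves the rank unchanged exactly when b lies in V(A).

    import numpy as np

    def consistent(A, b):
        # Ax = b is consistent  <=>  rank([A | b]) = rank(A)
        return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

    A = np.array([[1., 2.],
                  [2., 4.],
                  [0., 1.]])
    print(consistent(A, np.array([3., 6., 1.])))   # True: b = col 1 + col 2 is in V(A)
    print(consistent(A, np.array([1., 1., 0.])))   # False: b is not in V(A)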

This holds because b ∈ V(A) implies rank(A) = rank([A|b]), and rank([A|b]) = rank(A) implies b ∈ V(A). Also, if for an n×m matrix A, where n ≤ m, we have rank(A) = n (so that A is of full row rank), then rank([A|b]) = n (since [A|b] is n×(m+1)) and so, in this case, the system Ax = b is consistent for any b ∈ R^n. The matrix [A|b] is called the augmented matrix for the system Ax = b.

Note. Let A be an n×n nonsingular matrix (that is, A is square and of full rank). Then for the ith unit vector e_i ∈ R^n, we have e_i ∈ V(A) and so Ax_i = e_i has a solution x_i for i = 1, 2, ..., n. Creating the n×n matrix X with columns x_i and writing I_n with columns e_i, we can write these n systems of equations as the matrix equation AX = I_n. Since AX = I_n, X is a right inverse of A. Since rank(I_n) ≤ min{rank(A), rank(X)} by the theorem on the rank of a product, and n = rank(I_n) = rank(A), it must be that rank(X) = n. So a similar argument shows that X has a right inverse, say XY = I_n. But then A = AI_n = A(XY) = (AX)Y = I_nY = Y, and so XA = XY = I_n, and X is also a left inverse of A.

Definition. For an n×n full rank matrix A, the matrix B such that BA = AB = I_n is the inverse of matrix A, denoted B = A^{-1}.
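
The column-by-column construction of the inverse in the note above can be carried out numerically. This is only a sketch with an assumed 2×2 example; each column x_i of X is obtained by solving Ax_i = e_i.

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 1.]])                     # nonsingular
    n = A.shape[0]
    # Solve A x_i = e_i for each unit vector e_i and collect the solutions as columns.
    X = np.column_stack([np.linalg.solve(A, e) for e in np.eye(n)])
    print(np.allclose(A @ X, np.eye(n)))         # True: X is a right inverse
    print(np.allclose(X @ A, np.eye(n)))         # True: X is also a left inverse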

(Of course A^{-1} is unique for a given matrix A.)

Theorem. Let A be an n×n full rank matrix. Then (A^{-1})^T = (A^T)^{-1}.

Note. Gentle uses some unusual notation. He denotes (A^T)^{-1} = (A^{-1})^T as A^{-T}. He sometimes denotes AB^{-1} as A/B (UGH!) and B^{-1}A as B\A. I will avoid this notation.

Theorem. An n×m matrix A, where n ≤ m, has a right inverse if and only if A is of full row rank. An n×m matrix A, where m ≤ n, has a left inverse if and only if A has full column rank.

Note. This shows that a square matrix is nonsingular if and only if it is invertible.

Note. For an n×m matrix A, if the n×n matrix AA^T is of full rank, then (AA^T)^{-1} exists and a right inverse of A is A^T(AA^T)^{-1}, since AA^T(AA^T)^{-1} = I_n. Similarly, if A^TA is of full rank, then a left inverse of A is (A^TA)^{-1}A^T, since (A^TA)^{-1}A^TA = I_m.

Definition. Matrices of the same size that have the same rank are equivalent. For an n×m matrix A with rank r, where 0 < r ≤ min{n, m}, the equivalent canonical form of A is the n×m matrix [I_r | 0; 0 | 0].

Theorem. If A is an n×m matrix of rank r > 0, then there are matrices P and Q, both products of elementary matrices, such that PAQ is the equivalent canonical form of A, PAQ = [I_r | 0; 0 | 0].
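
A brief numerical check (example matrices assumed, not from the text) of two of the statements above: A^T(AA^T)^{-1} is a right inverse of a full row rank matrix, and transposition and inversion commute for a nonsingular matrix.

    import numpy as np

    A = np.array([[1., 0., 2.],
                  [0., 1., 1.]])                     # 2x3, full row rank
    R = A.T @ np.linalg.inv(A @ A.T)                 # candidate right inverse
    print(np.allclose(A @ R, np.eye(2)))             # True

    B = np.array([[2., 1.],
                  [1., 1.]])                         # nonsingular
    print(np.allclose(np.linalg.inv(B).T, np.linalg.inv(B.T)))   # True: (B^{-1})^T = (B^T)^{-1}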

Note. If A is symmetric, then the same operations in the proof of the previous theorem that are performed on the rows are also performed on the columns, so that we have PAP^T = [I_r | 0; 0 | 0] for P a product of elementary matrices.

Note. You dealt with row equivalence in Linear Algebra. This equivalence is a combination of row equivalence and column equivalence, so that A is equivalent to B if and only if B = PAQ where P and Q are products of elementary matrices.

Definition. A matrix R is in row echelon form (REF) if
(1) r_ij = 0 for i > j, and
(2) if k is such that r_ik ≠ 0 and r_iℓ = 0 for ℓ < k, then r_{i+1,j} = 0 for j ≤ k.

Note. To see that this definition of row echelon form is consistent with your sophomore Linear Algebra experience, notice that Condition (1) implies that there are only 0's below the main diagonal. Condition (2) implies that r_ik is the first nonzero entry in row i (called the pivot) and that the first nonzero entry in row i+1 lies to the right of pivot r_ik (that is, in a column of index greater than k).

Theorem. For any matrix A there is a matrix P, a product of elementary matrices, such that PA is in row echelon form.

Note. The proof of the theorem above follows by applying the technique of Gauss-Jordan elimination. An algorithmic explanation of Gauss-Jordan elimination can be found in my online notes for Linear Algebra (MATH 2010) on Solving Systems of Linear Equations.
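
The following is a bare-bones row reduction sketch of my own, not the algorithm from Gentle or from the MATH 2010 notes; it uses partial pivoting and elementary row operations to reduce an assumed example matrix to a row echelon form.

    import numpy as np

    def row_echelon(A, tol=1e-12):
        R = A.astype(float).copy()
        n, m = R.shape
        row = 0
        for col in range(m):
            if row >= n:
                break
            pivot = row + np.argmax(np.abs(R[row:, col]))   # partial pivoting
            if abs(R[pivot, col]) < tol:
                continue                                    # no pivot in this column
            R[[row, pivot]] = R[[pivot, row]]               # swap rows (elementary operation)
            # Eliminate the entries below the pivot (elementary row operations).
            R[row + 1:] -= np.outer(R[row + 1:, col] / R[row, col], R[row])
            row += 1
        return R

    A = np.array([[0., 2., 1.],
                  [1., 1., 0.],
                  [2., 4., 1.]])
    print(row_echelon(A))     # upper triangular with a zero bottom row; rank of A is 2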

Note. Gentle does not define the reduced row echelon form of a matrix, in which the matrix is in row echelon form, each pivot is 1, and all entries above the pivots are 0. We can use Gentle's approach to define this as follows.

Definition. A matrix R is in reduced row echelon form (RREF) if it is in row echelon form and
(3) if k is such that r_ik ≠ 0 and r_iℓ = 0 for ℓ < k, then r_ik = 1 and r_jk = 0 for j < i.

Definition. A (square) upper triangular matrix H is in Hermite form if
(1) h_ii = 0 or 1,
(2) if h_ii = 0 then h_ij = 0 for all j, and
(3) if h_ii = 1, then h_ki = 0 for all k ≠ i.

Note. For a matrix in Hermite form, Condition (1) implies that the main diagonal entries are 0's and 1's. Condition (2) implies that the rows containing a 0 diagonal entry are all 0's. Condition (3) implies that a column containing a 1 diagonal entry has all other entries 0. Notice that a diagonal entry h_ii = 0 may have nonzero entries above it in column i. For example, H = [1 2; 0 0] is in Hermite form.
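
The three Hermite form conditions are easy to test mechanically. The predicate below is my own sketch (the function name and tolerance are assumptions); it checks the 2×2 example just given.

    import numpy as np

    def is_hermite_form(H, tol=1e-12):
        n = H.shape[0]
        for i in range(n):
            if not (abs(H[i, i]) < tol or abs(H[i, i] - 1.0) < tol):
                return False                              # (1) diagonal entries are 0 or 1
            if abs(H[i, i]) < tol and np.any(np.abs(H[i, :]) > tol):
                return False                              # (2) a 0 diagonal entry means a zero row
            if abs(H[i, i] - 1.0) < tol and np.any(np.abs(np.delete(H[:, i], i)) > tol):
                return False                              # (3) a 1 diagonal entry means the rest of its column is 0
        return True

    H = np.array([[1., 2.],
                  [0., 0.]])
    print(is_hermite_form(H))     # True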

Note. In an exercise it is shown that if H is in Hermite form then H^2 = H.

Theorem. Let A be a square full rank matrix (that is, A is nonsingular). If B and C are conformable matrices for the multiplications AB and CA, then rank(AB) = rank(B) and rank(CA) = rank(C).

Note. In fact, this theorem can be extended to nonsquare matrices as follows. If A is a full column rank matrix and B is conformable for the multiplication AB, then rank(AB) = rank(B). If A is a full row rank matrix and C is conformable for the multiplication CA, then rank(CA) = rank(C).

Definition. Recall that an n×n symmetric matrix A is positive definite if for each x ∈ R^n with x ≠ 0 the quadratic form satisfies x^TAx > 0. The next result shows that positive definiteness is preserved under a particular type of multiplication by a full rank matrix.

Theorem. Let C be an n×n matrix and let A be an n×m matrix.
(1) If C is positive definite and A is of full column rank m ≤ n, then A^TCA is positive definite.
(2) If A^TCA is positive definite, then A is of full column rank m.
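
To close, here is a small numerical illustration of the last theorem and of the exercise on Hermite form matrices; the matrices C, A, and H are assumed examples, and positive definiteness is verified through the eigenvalues of the symmetric matrix A^TCA.

    import numpy as np

    C = np.array([[2., 1., 0.],
                  [1., 3., 1.],
                  [0., 1., 2.]])                  # symmetric positive definite
    A = np.array([[1., 0.],
                  [0., 1.],
                  [1., 1.]])                      # 3x2, full column rank
    M = A.T @ C @ A                               # symmetric
    print(np.all(np.linalg.eigvalsh(M) > 0))      # True: A^T C A is positive definite

    H = np.array([[1., 2.],
                  [0., 0.]])                      # the Hermite form example above
    print(np.allclose(H @ H, H))                  # True: H^2 = H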

