
Eigenvalues and Eigenvectors §5.2 Diagonalization

Satya Mandal, KU, Summer 2017



Transcription of Eigenvalues and Eigenvectors §5.2 Diagonalization

1 Goals: Suppose A is a square matrix of order n. We provide a necessary and sufficient condition for the existence of an invertible matrix P such that P⁻¹AP is a diagonal matrix.

Definitions: Two square matrices A, B are said to be similar if there is an invertible matrix P such that A = P⁻¹BP. A square matrix A is said to be diagonalizable if there is an invertible matrix P such that P⁻¹AP is a diagonal matrix.

2 That means: if A is similar to a diagonal matrix, we say that A is diagonalizable.

Theorem: Suppose A, B are two similar matrices. Then A and B have the same eigenvalues.

Proof: Write A = P⁻¹BP. Then

|λI − A| = |λI − P⁻¹BP| = |λ(P⁻¹P) − P⁻¹BP| = |P⁻¹(λI − B)P|
         = |P⁻¹| |λI − B| |P| = |P|⁻¹ |λI − B| |P| = |λI − B|.

So A and B have the same characteristic polynomial, and hence the same eigenvalues. The proof is complete.

Diagonalizability: We ask, when is a square matrix diagonalizable?
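The determinant computation above can be checked numerically. This is a minimal sketch, assuming NumPy; the matrices B and P below are my own small example, not from the slides.

```python
import numpy as np

# A hypothetical pair: an upper-triangular B and an invertible P.
B = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])   # det(P) = 1, so P is invertible

# A = P^{-1} B P is similar to B.
A = np.linalg.inv(P) @ B @ P

# Similar matrices have the same characteristic polynomial,
# hence the same eigenvalues.
eig_A = np.sort(np.linalg.eigvals(A).real)
eig_B = np.sort(np.linalg.eigvals(B).real)
```

Since B is upper triangular, its eigenvalues are its diagonal entries 2 and 3, and the similar matrix A has the same two eigenvalues.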

3 Theorem: A square matrix A of order n is diagonalizable if and only if A has n linearly independent eigenvectors.

Proof: There are two statements to prove. First, suppose A is diagonalizable: P⁻¹AP = D, and hence AP = PD, where P is an invertible matrix and D is a diagonal matrix. Write

D = diag(λ_1, λ_2, …, λ_n),  P = (p_1 p_2 ⋯ p_n).

Since AP = PD, we have

A (p_1 p_2 ⋯ p_n) = (p_1 p_2 ⋯ p_n) diag(λ_1, λ_2, …, λ_n),

or

(Ap_1 Ap_2 ⋯ Ap_n) = (λ_1 p_1 λ_2 p_2 ⋯ λ_n p_n).

So Ap_i = λ_i p_i for i = 1, 2, …, n. Since P is invertible, p_i ≠ 0, and hence p_i is an eigenvector of A for λ_i.

4 Also, rank(P) = n, so the columns {p_1, p_2, …, p_n} of P are linearly independent. Thus, it is established that if A is diagonalizable, then A has n linearly independent eigenvectors.

Now we prove the converse. So, we assume A has n linearly independent eigenvectors {p_1, p_2, …, p_n}. Then Ap_1 = λ_1 p_1, Ap_2 = λ_2 p_2, …, Ap_n = λ_n p_n for some eigenvalues λ_1, λ_2, …, λ_n. Write P = (p_1 p_2 ⋯ p_n) and D = diag(λ_1, λ_2, …, λ_n).

5 It follows from the equations Ap_i = λ_i p_i that AP = PD. So P⁻¹AP = D is a diagonal matrix, and the proof is complete.

Steps for Diagonalizing: Suppose A is a square matrix of order n.
- If A does not have n linearly independent eigenvectors, then A is not diagonalizable.
- If possible, find n linearly independent eigenvectors p_1, p_2, …, p_n for A, with corresponding eigenvalues λ_1, λ_2, …, λ_n.
- Then write P = (p_1 p_2 ⋯ p_n) and D = diag(λ_1, λ_2, …, λ_n).
- We have D = P⁻¹AP, a diagonal matrix.

Corollary: Let V be a vector space and let x_1, x_2,
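The diagonalizing steps above can be sketched numerically. This is a minimal illustration, assuming NumPy; the sample matrix and variable names are mine, not from the slides.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # sample matrix of order n = 2

# Step 1: find eigenvalues and eigenvectors.
lam, P = np.linalg.eig(A)          # column P[:, i] is an eigenvector for lam[i]

# Step 2: A is diagonalizable iff the n eigenvectors are
# linearly independent, i.e. P is invertible.
assert np.linalg.matrix_rank(P) == A.shape[0]

# Step 3: D = P^{-1} A P is diagonal, with the eigenvalues
# on the diagonal, in the same order as the columns of P.
D = np.linalg.inv(P) @ A @ P
```

For this A the characteristic polynomial is λ² − 7λ + 10, so the eigenvalues are 2 and 5, and D comes out as diag(lam) up to floating-point error.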

6 …, x_n be vectors in V. Then x_1, x_2, …, x_n are linearly dependent if and only if there is an integer m ≤ n such that (1) x_1, x_2, …, x_{m−1} are linearly independent and (2) x_m ∈ span(x_1, x_2, …, x_{m−1}).

Proof: Suppose x_1, x_2, …, x_n are linearly dependent. By an earlier theorem, one of these vectors is a linear combination of the rest. By relabeling, we can assume x_n is a linear combination of x_1, x_2, …, x_{n−1}. Let

m = min{ k : x_k ∈ span(x_1, x_2, …, x_{k−1}) }.

If x_1, x_2, …, x_{m−1} were linearly dependent, then we could apply that theorem again, which would contradict the minimality of m.

7 So x_1, x_2, …, x_{m−1} are linearly independent. This establishes one direction.

Conversely, suppose there is an m ≤ n such that (1) and (2) hold. Then x_m = c_1 x_1 + ⋯ + c_{m−1} x_{m−1} for some c_1, …, c_{m−1} ∈ ℝ. So

c_1 x_1 + ⋯ + c_{m−1} x_{m−1} + (−1) x_m = 0,

which is a nontrivial linear combination. So x_1, x_2, …, x_m, …, x_n are linearly dependent.

Theorem (With Distinct Eigenvalues): Let A be a square matrix of order n. Suppose A has n distinct eigenvalues. Then
- the corresponding eigenvectors are linearly independent, and
- A is diagonalizable.

The second statement follows from the first, by the diagonalizability theorem above. So we prove the first statement. Let λ_1, λ_2,

8 …, λ_n be the distinct eigenvalues. For i = 1, 2, …, n we have Ax_i = λ_i x_i, where the x_i ≠ 0 are eigenvectors.

We need to prove that x_1, x_2, …, x_n are linearly independent. We prove this by contrapositive: assume they are linearly dependent. By the Corollary, there is an m < n such that x_1, x_2, …, x_m are linearly independent and x_{m+1} ∈ span(x_1, x_2, …, x_m). So x_{m+1} can be written as a linear combination:

x_{m+1} = c_1 x_1 + c_2 x_2 + ⋯ + c_m x_m    (1)

Here at least one c_i ≠ 0. Relabeling, if needed, we can assume c_1 ≠ 0.

Multiply (1) by A on the left:

Ax_{m+1} = c_1 Ax_1 + c_2 Ax_2 + ⋯ + c_m Ax_m    (2)

Now use Ax_i = λ_i x_i.

9 This gives

λ_{m+1} x_{m+1} = λ_1 c_1 x_1 + λ_2 c_2 x_2 + ⋯ + λ_m c_m x_m    (3)

Also, multiplying (1) by λ_{m+1}, we have

λ_{m+1} x_{m+1} = λ_{m+1} c_1 x_1 + λ_{m+1} c_2 x_2 + ⋯ + λ_{m+1} c_m x_m    (4)

Subtract (3) from (4):

(λ_{m+1} − λ_1) c_1 x_1 + (λ_{m+1} − λ_2) c_2 x_2 + ⋯ + (λ_{m+1} − λ_m) c_m x_m = 0.

But these vectors are linearly independent, and hence (λ_{m+1} − λ_i) c_i = 0 for i = 1, 2, …, m. Since c_1 ≠ 0, we get λ_{m+1} − λ_1 = 0, or λ_{m+1} = λ_1, which contradicts that the eigenvalues are distinct. So we conclude that x_1, x_2, …, x_n are linearly independent. The proof is complete.

Example: Let

A = [ 2  3  1 ]      P = [ 1  1  5 ]
    [ 0 −1  2 ],         [ 0 −1  1 ].
    [ 0  0  3 ]          [ 0  0  2 ]

10 Verify that A is diagonalizable, by computing P⁻¹AP.

Solution: We do it in two steps. Use a TI (or work by hand) to compute

P⁻¹ = [ 1  1  −3  ]
      [ 0 −1  1/2 ].
      [ 0  0  1/2 ]

Then P⁻¹AP = diag(2, −1, 3). So it is verified that P⁻¹AP is a diagonal matrix.

Example: Let

A = [  3  1 ]
    [ −9 −3 ].

Show that A is not diagonalizable.

Solution: Use the theorem above and show that A does not have 2 linearly independent eigenvectors. To do this, we find and count the dimensions of all the eigenspaces E(λ). We do it in a few steps. First, find all the eigenvalues. To do this, we solve

det(λI − A) = (λ − 3)(λ + 3) + 9 = λ² = 0.

So λ = 0 is the only eigenvalue of A.

Now we compute the eigenspace E(0) of the eigenvalue λ = 0. E(0) is the solution space of (0I − A)(x, y)ᵀ = (0, 0)ᵀ, or

[ −3 −1 ] [ x ]   [ 0 ]
[  9  3 ] [ y ] = [ 0 ].

Using a TI (or by hand), a parametric solution of this system is given by x = t, y = −3t. So E(0) = span{(1, −3)} has dimension 1 < 2, and hence A does not have 2 linearly independent eigenvectors. So A is not diagonalizable.
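Both examples can be double-checked numerically. This is a minimal sketch, assuming NumPy in place of the TI calculator; the matrices are the ones from the two examples above.

```python
import numpy as np

# Example 1: A1 is diagonalizable, with P^{-1} A1 P = diag(2, -1, 3).
A1 = np.array([[2.0,  3.0, 1.0],
               [0.0, -1.0, 2.0],
               [0.0,  0.0, 3.0]])
P  = np.array([[1.0,  1.0, 5.0],
               [0.0, -1.0, 1.0],
               [0.0,  0.0, 2.0]])
D = np.linalg.inv(P) @ A1 @ P      # should equal diag(2, -1, 3)

# Example 2: A2 has the single eigenvalue 0, and the eigenspace
# E(0) = null(0I - A2) is one-dimensional, so A2 is not
# diagonalizable (it lacks 2 independent eigenvectors).
A2 = np.array([[ 3.0,  1.0],
               [-9.0, -3.0]])
eigenspace_dim = 2 - np.linalg.matrix_rank(0 * np.eye(2) - A2)
```

By rank-nullity, dim E(0) = n − rank(0I − A2), which is how the eigenspace dimension is counted in the last line.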

