
Linear Algebra: Graduate Level Problems and Solutions




Transcription of Linear Algebra: Graduate Level Problems and Solutions

Linear Algebra: Graduate Level Problems and Solutions. Igor Yanovsky, 2005.

Disclaimer: This handbook is intended to assist graduate students with qualifying examination preparation. Please be aware, however, that the handbook might contain, and almost certainly contains, typos as well as incorrect or inaccurate solutions. I cannot be made responsible for any inaccuracies contained in this handbook.

Contents

1 Basic Theory
  1.1 Linear Maps
  1.2 Linear Maps as Matrices
  1.3 Dimension and Isomorphism
  1.4 Matrix Representations Redux
  1.5 Subspaces
  1.6 Linear Maps and Subspaces
  1.7 Dimension Formula
  1.8 Matrix Calculations
  1.9 Diagonalizability
2 Inner Product Spaces
  2.1 Inner Products
  2.2 Orthonormal Bases
  2.3 Gram-Schmidt Procedure
  2.4 QR Factorization
  2.5 Orthogonal Complements and Projections
3 Linear Maps on Inner Product Spaces
  3.1 Adjoint Maps
  3.2 Self-Adjoint Maps
  3.3 Polarization and Isometries
  3.4 Unitary and Orthogonal Operators

  3.5 Spectral Theorem
  3.6 Normal Operators
  3.7 Unitary Equivalence
  3.8 Triangulability
4 Characteristic Polynomial
5 Dual Spaces and Dual Maps
6 Problems

1 Basic Theory

1.1 Linear Maps

Lemma. If A ∈ Mat_{m×n}(F) and B ∈ Mat_{n×m}(F), then tr(AB) = tr(BA).

Proof. Note that the (i, i) entry in AB is Σ_{j=1}^{n} a_{ij} b_{ji}, while the (j, j) entry in BA is Σ_{i=1}^{m} b_{ji} a_{ij}. Thus

  tr(AB) = Σ_{i=1}^{m} Σ_{j=1}^{n} a_{ij} b_{ji},   tr(BA) = Σ_{j=1}^{n} Σ_{i=1}^{m} b_{ji} a_{ij},

and the two double sums agree.

1.2 Linear Maps as Matrices

Example. Let V = {α_0 + α_1 t + ··· + α_n t^n : α_0, α_1, ..., α_n ∈ F} be the space of polynomials of degree ≤ n and D : V → V the differential map

  D(α_0 + α_1 t + ··· + α_n t^n) = α_1 + ··· + n α_n t^{n-1}.

If we use the basis 1, t, ..., t^n for V, then we see that D(t^k) = k t^{k-1}, and thus the (n+1) × (n+1) matrix representation is computed via

  [D(1) D(t) D(t^2) ··· D(t^n)] = [0 1 2t ··· n t^{n-1}] = [1 t t^2 ··· t^n] ·
    [ 0 1 0 ··· 0 ]
    [ 0 0 2 ··· 0 ]
    [ ⋮       ⋱ ⋮ ]
    [ 0 0 0 ··· n ]
    [ 0 0 0 ··· 0 ]

1.3 Dimension and Isomorphism

A linear map L : V → W is an isomorphism if we can find K : W → V such that LK = I_W and KL = I_V.

Theorem. V and W are isomorphic if and only if there is a bijective linear map L : V → W.

Proof. If V and W are isomorphic, we can find linear maps L : V → W and K : W → V so that LK = I_W and KL = I_V.
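The trace identity tr(AB) = tr(BA) from 1.1 is easy to check numerically. A minimal sketch in pure Python (the 2×3 and 3×2 sample matrices are arbitrary illustrative values):

```python
def matmul(A, B):
    # (i, j) entry of AB is sum_k A[i][k] * B[k][j].
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

# A is m x n and B is n x m with m = 2, n = 3; AB is 2x2, BA is 3x3,
# yet the lemma says their traces agree.
A = [[1, 2, 3],
     [4, 5, 6]]
B = [[7, 8],
     [9, 10],
     [11, 12]]

assert trace(matmul(A, B)) == trace(matmul(B, A))   # tr(AB) = tr(BA)
```

Note the two products have different sizes; the equality of traces is exactly the double-sum symmetry in the proof.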

Then for any y ∈ W, y = I_W(y) = L(K(y)), so we can let x = K(y), which means L is onto. If L(x_1) = L(x_2), then x_1 = I_V(x_1) = KL(x_1) = KL(x_2) = I_V(x_2) = x_2, which means L is 1-1.

Conversely, assume L : V → W is linear and a bijection. Then we have an inverse map L^{-1} which satisfies L ∘ L^{-1} = I_W and L^{-1} ∘ L = I_V. In order for this inverse map to be allowable as K we need to check that it is linear. Select α_1, α_2 ∈ F and y_1, y_2 ∈ W; let x_i = L^{-1}(y_i), so that L(x_i) = y_i. Then

  L^{-1}(α_1 y_1 + α_2 y_2) = L^{-1}(α_1 L(x_1) + α_2 L(x_2)) = L^{-1}(L(α_1 x_1 + α_2 x_2)) = I_V(α_1 x_1 + α_2 x_2) = α_1 x_1 + α_2 x_2 = α_1 L^{-1}(y_1) + α_2 L^{-1}(y_2).

Theorem. If F^m and F^n are isomorphic over F, then n = m.

Proof. Suppose we have L : F^m → F^n and K : F^n → F^m such that LK = I_{F^n} and KL = I_{F^m}. Then L ∈ Mat_{n×m}(F) and K ∈ Mat_{m×n}(F). Thus

  n = tr(I_{F^n}) = tr(LK) = tr(KL) = tr(I_{F^m}) = m.

Define the dimension of a vector space V over F as dim_F V = n if V is isomorphic to F^n. Remark: dim_C C = 1, dim_R C = 2, dim_Q R = ∞.

The set of all linear maps {L : V → W} over F is itself a vector space, denoted hom_F(V, W).

Theorem. If V and W are finite-dimensional vector spaces over F, then hom_F(V, W) is also finite dimensional and

  dim_F hom_F(V, W) = (dim_F W) · (dim_F V).

Proof. By choosing bases for V and W there is a natural mapping

  hom_F(V, W) → Mat_{(dim_F W)×(dim_F V)}(F) ≅ F^{(dim_F W)·(dim_F V)}.

This map is both 1-1 and onto, as the matrix representation uniquely determines the linear map and every matrix yields a linear map.

1.4 Matrix Representations Redux

L : V → W, with bases x_1, ..., x_m for V and y_1, ..., y_n for W.

The matrix for L interpreted as a linear map is [L] : F^m → F^n. The basis isomorphisms defined by the choices of basis for V and W are

  [x_1 ··· x_m] : F^m → V,   [y_1 ··· y_n] : F^n → W,

and the diagram commutes: L ∘ [x_1 ··· x_m] = [y_1 ··· y_n] ∘ [L]. Here [x_1 ··· x_m] : F^m → V means [x_1 ··· x_m](α_1, ..., α_m)^T = α_1 x_1 + ··· + α_m x_m.

1.5 Subspaces

A nonempty subset M ⊂ V is a subspace if for α, β ∈ F and x, y ∈ M we have αx + βy ∈ M; in particular, 0 ∈ M. If M, N ⊂ V are subspaces, then we can form two new subspaces, the sum and the intersection:

  M + N = {x + y : x ∈ M, y ∈ N},   M ∩ N = {x : x ∈ M, x ∈ N}.

M and N have trivial intersection if M ∩ N = {0}. M and N are transversal if M + N = V. Two subspaces are complementary if they are transversal and have trivial intersection. M, N form a direct sum of V if M ∩ N = {0} and M + N = V; write V = M ⊕ N.

Example. V = R², M = {(x, 0) : x ∈ R} (the x-axis), and N = {(0, y) : y ∈ R} (the y-axis).

Example. V = R², M = {(x, 0) : x ∈ R} (the x-axis), and N = {(y, y) : y ∈ R} (the diagonal). Observe that (x, y) = (x − y, 0) + (y, y), which gives V = M ⊕ N.

If we have a direct sum decomposition V = M ⊕ N, then we can construct the projection of V onto M along N. The map E : V → V is defined using that each z = x + y with x ∈ M, y ∈ N, and

  E(z) = E(x + y) = E(x) + E(y) = E(x) = x.

Thus im(E) = M and ker(E) = N.

Definition. If V is a vector space, a projection of V is a linear operator E on V such that E² = E.

1.6 Linear Maps and Subspaces

Let L : V → W be a linear map. The kernel or nullspace of L is

  ker(L) = N(L) = {x ∈ V : L(x) = 0}.

The image or range of L is

  im(L) = R(L) = L(V) = {L(x) ∈ W : x ∈ V}.

Lemma. ker(L) is a subspace of V and im(L) is a subspace of W.

Proof. Assume that α_1, α_2 ∈ F and that x_1, x_2 ∈ ker(L); then L(α_1 x_1 + α_2 x_2) = α_1 L(x_1) + α_2 L(x_2) = 0, so α_1 x_1 + α_2 x_2 ∈ ker(L).
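The projection in the second direct-sum example above is completely explicit: E(x, y) = (x − y, 0) projects R² onto the x-axis along the diagonal. A minimal sketch in pure Python checking that E is idempotent with the stated image and kernel (the sample points are arbitrary):

```python
def E(v):
    # Projection of R^2 onto M = x-axis along N = diagonal,
    # from the decomposition (x, y) = (x - y, 0) + (y, y).
    x, y = v
    return (x - y, 0)

v = (5, 2)
assert E(v) == (3, 0)        # the component of v in M
assert E(E(v)) == E(v)       # E is idempotent: E^2 = E
assert E((7, 7)) == (0, 0)   # the diagonal N is exactly ker(E)
```

Any point of the x-axis is fixed by E, so im(E) = M, matching the general construction.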

Assume α_1, α_2 ∈ F and x_1, x_2 ∈ V; then α_1 L(x_1) + α_2 L(x_2) = L(α_1 x_1 + α_2 x_2) ∈ im(L).

Lemma. L is 1-1 if and only if ker(L) = {0}.

Proof. We know that L(0) = 0, so if L is 1-1, then L(x) = 0 = L(0) implies that x = 0; hence ker(L) = {0}. Conversely, assume that ker(L) = {0}. If L(x_1) = L(x_2), then linearity of L tells us that L(x_1 − x_2) = 0. Then ker(L) = {0} implies x_1 − x_2 = 0, which shows that x_1 = x_2.

Lemma. Suppose L : V → W and dim V = dim W. Then L is 1-1 ⇔ L is onto ⇔ dim im(L) = dim V.

Proof. By the dimension formula, we have dim V = dim ker(L) + dim im(L). L is 1-1 ⇔ ker(L) = {0} ⇔ dim ker(L) = 0 ⇔ dim im(L) = dim V ⇔ dim im(L) = dim W ⇔ im(L) = W, that is, L is onto.

1.7 Dimension Formula

Theorem. Let V be finite dimensional and L : V → W a linear map, all over F. Then im(L) is finite dimensional and

  dim_F V = dim_F ker(L) + dim_F im(L).

Proof. We know that dim ker(L) ≤ dim V and that ker(L) has a complement M of dimension k = dim V − dim ker(L). Since M ∩ ker(L) = {0}, the linear map L must be 1-1 when restricted to M. Thus L|_M : M → im(L) is an isomorphism, and dim im(L) = dim M = k.

1.8 Matrix Calculations

Change of Basis. Example. Given the two bases of R², β_1 = {x_1 = (1,1), x_2 = (1,0)} and β_2 = {y_1 = (4,3), y_2 = (3,2)}, we find the change-of-basis matrix P from β_1 to β_2. Write y_1 as a linear combination of x_1 and x_2: y_1 = a x_1 + b x_2.

(4,3) = a(1,1) + b(1,0) ⇒ a = 3, b = 1 ⇒ y_1 = 3x_1 + x_2.

Write y_2 as a linear combination of x_1 and x_2, y_2 = a x_1 + b x_2:

  (3,2) = a(1,1) + b(1,0) ⇒ a = 2, b = 1 ⇒ y_2 = 2x_1 + x_2.

Write the coordinates of y_1 and y_2 as the columns of P:

  P = [ 3 2 ]
      [ 1 1 ]

1.9 Diagonalizability

Definition. Let T be a linear operator on the finite-dimensional space V. T is diagonalizable if there is a basis for V consisting of eigenvectors of T.

Theorem. Let v_1, ..., v_n be nonzero eigenvectors of distinct eigenvalues λ_1, ..., λ_n. Then {v_1, ..., v_n} is linearly independent. Corollary: if L has n distinct eigenvalues λ_1, ..., λ_n, then L is diagonalizable. (Proof is in the exercises.)

Definition. Let L be a linear operator on a finite-dimensional vector space V, and let λ be an eigenvalue of L. Define E_λ = {x ∈ V : L(x) = λx} = ker(L − λI_V). The set E_λ is called the eigenspace of L corresponding to the eigenvalue λ. The algebraic multiplicity is defined to be the multiplicity of λ as a root of the characteristic polynomial of L, while the geometric multiplicity of λ is defined to be the dimension of its eigenspace, dim E_λ = dim(ker(L − λI_V)). Also, the geometric multiplicity never exceeds the algebraic multiplicity.

Eigenvectors. Any nonzero vector v with (A − λI)v = 0 is an eigenvector for λ.

Generalized eigenvectors. Let λ be an eigenvalue of A with algebraic multiplicity m. Any nonzero vector v with (A − λI)^m v = 0 is a generalized eigenvector for λ.
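The diagonalizability criterion above can be spot-checked on a concrete operator. A minimal sketch in pure Python, using an assumed example matrix A with distinct eigenvalues 2 and 3 (so the corollary guarantees it is diagonalizable), and hand-found eigenvectors:

```python
def matvec(A, v):
    # Matrix-vector product: (Av)_i = sum_j A[i][j] v[j].
    return [sum(a * x for a, x in zip(row, v)) for row in A]

# A is upper triangular with distinct diagonal entries 2 and 3,
# hence distinct eigenvalues.
A = [[2, 1],
     [0, 3]]

# Verify A v = lambda v for each claimed eigenpair.
v1, lam1 = [1, 0], 2
v2, lam2 = [1, 1], 3
assert matvec(A, v1) == [lam1 * x for x in v1]
assert matvec(A, v2) == [lam2 * x for x in v2]

# {v1, v2} is linearly independent: det of the matrix [v1 v2] is nonzero,
# as the theorem on eigenvectors of distinct eigenvalues predicts.
assert v1[0] * v2[1] - v1[1] * v2[0] != 0
```

The basis {v1, v2} therefore diagonalizes A, illustrating the definition of diagonalizability.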

2 Inner Product Spaces

2.1 Inner Products

The three important properties for complex inner products are:
1) (x|x) = ‖x‖² > 0 unless x = 0.
2) (x|y) = conj (y|x).
3) For each y ∈ V the map x ↦ (x|y) is linear.

The inner product on C^n is defined by (x|y) = x^t conj(y). Consequences:

  (α_1 x_1 + α_2 x_2 | y) = α_1 (x_1|y) + α_2 (x_2|y),
  (x | β_1 y_1 + β_2 y_2) = conj(β_1) (x|y_1) + conj(β_2) (x|y_2),
  (αx | αx) = α conj(α) (x|x) = |α|² (x|x).

2.2 Orthonormal Bases

Lemma. Let e_1, ..., e_n be orthonormal. Then e_1, ..., e_n are linearly independent, and any element x ∈ span{e_1, ..., e_n} has the expansion

  x = (x|e_1)e_1 + ··· + (x|e_n)e_n.

Proof. Note that if x = α_1 e_1 + ··· + α_n e_n, then

  (x|e_i) = (α_1 e_1 + ··· + α_n e_n | e_i) = α_1 (e_1|e_i) + ··· + α_n (e_n|e_i) = α_1 δ_{1i} + ··· + α_n δ_{ni} = α_i.

2.3 Gram-Schmidt Procedure

Given a linearly independent set x_1, ..., x_m in an inner product space V, it is possible to construct an orthonormal collection e_1, ..., e_m such that span{x_1, ..., x_m} = span{e_1, ..., e_m}:

  e_1 = x_1 / ‖x_1‖,
  z_2 = x_2 − proj_{x_1}(x_2) = x_2 − proj_{e_1}(x_2) = x_2 − (x_2|e_1)e_1,   e_2 = z_2 / ‖z_2‖,
  ...
  z_{k+1} = x_{k+1} − (x_{k+1}|e_1)e_1 − ··· − (x_{k+1}|e_k)e_k,   e_{k+1} = z_{k+1} / ‖z_{k+1}‖.

2.4 QR Factorization

A = [x_1 ··· x_m] = [e_1 ··· e_m] R, where R is the upper triangular matrix

  R = [ (x_1|e_1) (x_2|e_1) ··· (x_m|e_1) ]
      [     0     (x_2|e_2) ··· (x_m|e_2) ]

      [     ⋮         ⋮     ⋱      ⋮     ]
      [     0         0     ··· (x_m|e_m) ],

i.e., A = QR with Q = [e_1 ··· e_m].

Example. Consider the vectors x_1 = (1,1,0), x_2 = (1,0,1), x_3 = (0,1,1). Perform Gram-Schmidt:

  e_1 = x_1/‖x_1‖ = (1,1,0)/√2 = (1/√2, 1/√2, 0).
  z_2 = (1,0,1) − (1/√2)(1/√2, 1/√2, 0) = (1/2, −1/2, 1),   e_2 = z_2/‖z_2‖ = (1/2, −1/2, 1)/√(3/2) = (1/√6, −1/√6, 2/√6).
  z_3 = x_3 − (x_3|e_1)e_1 − (x_3|e_2)e_2 = (0,1,1) − (1/√2)(1/√2, 1/√2, 0) − (1/√6)(1/√6, −1/√6, 2/√6) = (−2/3, 2/3, 2/3),   e_3 = z_3/‖z_3‖ = (−1/√3, 1/√3, 1/√3).

2.5 Orthogonal Complements and Projections

The orthogonal projection of a vector x onto a nonzero vector y is defined by

  proj_y(x) = (x | y/‖y‖) y/‖y‖ = ((x|y)/(y|y)) y.

(The length of this projection is ‖proj_y(x)‖ = |(x|y)|/‖y‖.) The definition of proj_y(x) immediately implies that it is linear, from the linearity of the inner product.

Proposition. The map x ↦ proj_y(x) is a projection.

Proof. Need to show proj_y(proj_y(x)) = proj_y(x):

  proj_y(proj_y(x)) = proj_y( ((x|y)/(y|y)) y ) = ((x|y)/(y|y)) proj_y(y) = ((x|y)/(y|y)) ((y|y)/(y|y)) y = ((x|y)/(y|y)) y = proj_y(x).

Cauchy-Schwarz Inequality. Let V be an inner product space. Then |(x|y)| ≤ ‖x‖ ‖y‖ for all x, y ∈ V.

Proof. First show proj_y(x) ⊥ (x − proj_y(x)):

  (proj_y(x) | x − proj_y(x))
  = ( ((x|y)/‖y‖²) y | x − ((x|y)/‖y‖²) y )
  = ( ((x|y)/‖y‖²) y | x ) − ( ((x|y)/‖y‖²) y | ((x|y)/‖y‖²) y )
  = ((x|y)/‖y‖²) (y|x) − ((x|y)/‖y‖²) conj((x|y)/‖y‖²) (y|y)
  = ((x|y)/‖y‖²) (y|x) − ((x|y)/‖y‖²) conj(x|y) = 0,

since (y|x) = conj (x|y).
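Both facts just established for proj_y, idempotence and the orthogonality step behind Cauchy-Schwarz, can be checked numerically for real vectors. A minimal sketch in pure Python with arbitrary sample vectors:

```python
def dot(u, v):
    # Real inner product (x|y) on R^n.
    return sum(a * b for a, b in zip(u, v))

def proj(y, x):
    # proj_y(x) = ((x|y)/(y|y)) y for nonzero y.
    c = dot(x, y) / dot(y, y)
    return [c * yi for yi in y]

x, y = [3.0, 1.0, 2.0], [1.0, -1.0, 4.0]
p = proj(y, x)

# proj_y is a projection: proj_y(proj_y(x)) = proj_y(x).
assert all(abs(a - b) < 1e-12 for a, b in zip(proj(y, p), p))

# Orthogonality step of the Cauchy-Schwarz proof: p is orthogonal to x - p.
assert abs(dot(p, [xi - pi for xi, pi in zip(x, p)])) < 1e-12

# The inequality itself: (x|y)^2 <= ||x||^2 ||y||^2.
assert dot(x, y) ** 2 <= dot(x, x) * dot(y, y)
```

Over R the conjugations in the proof disappear, which is why the plain dot product suffices here.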

Then, since x = proj_y(x) + (x − proj_y(x)) is an orthogonal decomposition,

  ‖x‖ ≥ ‖proj_y(x)‖ = ‖ ((x|y)/(y|y)) y ‖ = (|(x|y)|/(y|y)) ‖y‖ = |(x|y)| / ‖y‖,

which gives |(x|y)| ≤ ‖x‖ ‖y‖.

Triangle Inequality. Let V be an inner product space. Then ‖x+y‖ ≤ ‖x‖ + ‖y‖.

Proof.

  ‖x+y‖² = (x+y | x+y) = ‖x‖² + 2 Re(x|y) + ‖y‖² ≤ ‖x‖² + 2|(x|y)| + ‖y‖² ≤ ‖x‖² + 2‖x‖‖y‖ + ‖y‖² = (‖x‖ + ‖y‖)².

Let M ⊂ V be a finite dimensional subspace of an inner product space, and e_1, ..., e_m an orthonormal basis for M. Using that basis, define E : V → V by

  E(x) = (x|e_1)e_1 + ··· + (x|e_m)e_m.

Note that E(x) ∈ M and that if x ∈ M, then E(x) = x. Thus E²(x) = E(x), implying that E is a projection whose image is M. If x ∈ ker(E), then

  0 = E(x) = (x|e_1)e_1 + ··· + (x|e_m)e_m ⇒ (x|e_1) = ··· = (x|e_m) = 0.

This is equivalent to the condition (x|z) = 0 for all z ∈ M. The set of all such vectors is the orthogonal complement to M in V, denoted

  M⊥ = {x ∈ V : (x|z) = 0 for all z ∈ M}.

Theorem. Let V be an inner product space. Assume V = M ⊕ M⊥. Then im(proj_M) = M and ker(proj_M) = M⊥. If M ⊂ V is finite dimensional, then V = M ⊕ M⊥ and

  proj_M(x) = (x|e_1)e_1 + ··· + (x|e_m)e_m

for any orthonormal basis e_1, ..., e_m for M.

Proof. With E as above, ker(E) = M⊥, and x = E(x) + (I − E)(x) with (I − E)(x) ∈ ker(E) = M⊥.
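The formula proj_M(x) = (x|e_1)e_1 + ··· + (x|e_m)e_m is directly computable once an orthonormal basis of M is fixed. A minimal sketch in pure Python, taking M to be the xy-plane in R³ (an assumed illustrative choice) with its standard orthonormal basis:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def proj_M(x, basis):
    # E(x) = (x|e1)e1 + ... + (x|em)em for an orthonormal basis of M.
    out = [0.0] * len(x)
    for e in basis:
        c = dot(x, e)
        out = [o + c * ei for o, ei in zip(out, e)]
    return out

basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]   # M = the xy-plane in R^3
x = [2.0, -3.0, 7.0]
p = proj_M(x, basis)
assert p == [2.0, -3.0, 0.0]

# x - E(x) lies in M⊥: it is orthogonal to every basis vector of M.
r = [xi - pi for xi, pi in zip(x, p)]
assert all(dot(r, e) == 0.0 for e in basis)

# E is a projection: E^2 = E.
assert proj_M(p, basis) == p
```

This mirrors the decomposition x = E(x) + (I − E)(x) used in the proof, with (I − E)(x) landing in M⊥.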

Choose z ∈ M. Then

  ‖x − proj_M(x)‖² ≤ ‖x − proj_M(x)‖² + ‖proj_M(x) − z‖² = ‖x − z‖²,

where equality holds exactly when ‖proj_M(x) − z‖² = 0, i.e., proj_M(x) is the unique closest point to x among the points of M.

Theorem. Let E : V → V be a projection onto M ⊂ V with the property that V = ker(E) ⊕ ker(E)⊥. Then the following conditions are equivalent:
1) E = proj_M.
2) im(E) = ker(E)⊥.
3) ‖E(x)‖ ≤ ‖x‖ for all x ∈ V.

Proof. We have already seen that (1) ⇔ (2). Also (1),(2) ⇒ (3), as x = E(x) + (I − E)(x) is then an orthogonal decomposition, so ‖x‖² = ‖E(x)‖² + ‖(I − E)(x)‖² ≥ ‖E(x)‖².

Hence, we only need to show that (3) implies that E is orthogonal. Choose x ∈ ker(E)⊥ and observe that E(x) = x − (I − E)(x) is an orthogonal decomposition. Thus

  ‖x‖² ≥ ‖E(x)‖² = ‖x − (I − E)(x)‖² = ‖x‖² + ‖(I − E)(x)‖² ≥ ‖x‖².

This means that (I − E)(x) = 0 and hence x = E(x) ∈ im(E), i.e., ker(E)⊥ ⊂ im(E). Conversely, if z ∈ im(E) = M, then we can write z = x + y ∈ ker(E) ⊕ ker(E)⊥. This implies that z = E(z) = E(y) = y, where the last equality follows from ker(E)⊥ ⊂ im(E). This means that x = 0 and hence z = y ∈ ker(E)⊥.

3 Linear Maps on Inner Product Spaces

3.1 Adjoint Maps

The adjoint of A is the matrix A* such that (A*)_{ij} = conj(a_{ji}). If A : F^m → F^n, then A* : F^n → F^m.
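Over R the adjoint is simply the transpose, and its defining property (Ax|y) = (x|A*y) is easy to verify numerically. A minimal sketch in pure Python with an arbitrary real 2×3 example:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, x):
    return [dot(row, x) for row in A]

def transpose(A):
    # Over R the adjoint A* is the transpose: (A*)_{ij} = a_{ji}.
    return [list(col) for col in zip(*A)]

A = [[1, 2, 0],
     [3, -1, 4]]          # A : R^3 -> R^2, so A* : R^2 -> R^3

x, y = [1, 2, 3], [4, 5]
# Defining property of the adjoint: (Ax|y) = (x|A*y).
assert dot(matvec(A, x), y) == dot(x, matvec(transpose(A), y))
```

Over C the same check would require conjugating the entries, matching (A*)_{ij} = conj(a_{ji}) above.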

