
A Singularly Valuable Decomposition: The SVD of a Matrix



Dan Kalman
The American University, Washington, DC 20016
February 13, 2002

Every teacher of linear algebra should be familiar with the matrix singular value decomposition (or SVD). It has interesting and attractive algebraic properties, and conveys important geometrical and theoretical insights about linear transformations. The close connection between the SVD and the well-known theory of diagonalization for symmetric matrices makes the topic immediately accessible to linear algebra teachers and, indeed, a natural extension of what these teachers already know. At the same time, the SVD has fundamental importance in several different applications of linear algebra. Strang was aware of these facts when he introduced the SVD in his now classical text [22, page 142], observing "...it is not nearly as famous as it should be." Golub and Van Loan ascribe a central significance to the SVD in their definitive explication of numerical matrix methods [8, page xiv], stating "...perhaps the most recurring theme in the book is the practical and theoretical value of [the SVD]."

Additional evidence of the significance of the SVD is its central role in a number of papers in recent years in Mathematics Magazine and The American Mathematical Monthly (for example [2, 3, 17, 23]). Although it is probably not feasible to include the SVD in the first linear algebra course, it definitely deserves a place in more advanced undergraduate courses, particularly those with a numerical or applied emphasis. My primary goals in this article are to bring the topic to the attention of a broad audience, and to reveal some of the facets that give it both practical and theoretical significance.

Theory

The SVD is intimately related to the familiar theory of diagonalizing a symmetric matrix. Recall that if A is a symmetric real n×n matrix, there is an orthogonal matrix V and a diagonal D such that A = VDV^T.
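The diagonalization recalled above can be checked concretely. The following sketch (the NumPy details and the example matrix are my own, not from the article) computes the EVD of a small symmetric matrix and verifies that A = VDV^T with orthonormal columns in V:

```python
import numpy as np

# An arbitrary symmetric 3x3 matrix, chosen for illustration
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is NumPy's eigensolver for symmetric/Hermitian matrices
eigvals, V = np.linalg.eigh(A)
D = np.diag(eigvals)

# Columns of V are orthonormal eigenvectors: V^T V = I
print(np.allclose(V.T @ V, np.eye(3)))   # True
# The decomposition reconstructs A: A = V D V^T
print(np.allclose(V @ D @ V.T, A))       # True
```
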

Here the columns of V are eigenvectors for A and form an orthonormal basis for R^n; the diagonal entries of D are the eigenvalues of A. To emphasize the connection with the SVD, we will refer to VDV^T as the eigenvalue decomposition, or EVD, for A.

For the SVD we begin with an arbitrary real m×n matrix A. As we shall see, there are orthogonal matrices U and V and a diagonal matrix, this time denoted Σ, such that A = UΣV^T. In this case, U is m×m and V is n×n, so that Σ is rectangular with the same dimensions as A. The diagonal entries of Σ, that is the Σ_ii = σ_i, can be arranged to be nonnegative and in order of decreasing magnitude. The positive ones are called the singular values of A. The columns of U and V are called left and right singular vectors for A.

The analogy between the EVD for a symmetric matrix and the SVD for an arbitrary matrix can be extended a little by thinking of matrices as linear transformations.
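The shapes just described can be seen with NumPy's built-in SVD; this sketch uses an example matrix of my own (note that np.linalg.svd returns V^T rather than V, and the singular values as a vector in descending order):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])                       # m = 3, n = 2

U, s, Vt = np.linalg.svd(A, full_matrices=True)  # s holds the singular values

# Rebuild the rectangular Sigma with the same dimensions as A
m, n = A.shape
Sigma = np.zeros((m, n))
Sigma[:n, :n] = np.diag(s)

print(U.shape, Sigma.shape, Vt.shape)            # (3, 3) (3, 2) (2, 2)
print(np.allclose(U @ Sigma @ Vt, A))            # True
print(np.all(s[:-1] >= s[1:]))                   # nonincreasing order: True
```
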

For a symmetric matrix A, the transformation takes R^n to itself, and the columns of V define an especially nice basis. When vectors are expressed relative to this basis, we see that the transformation simply dilates some components and contracts others, according to the magnitudes of the eigenvalues (with a reflection through the origin tossed in for negative eigenvalues). Moreover, the basis is orthonormal, which is the best kind of basis to have.

Now let's look at the SVD for an m×n matrix A. Here the transformation takes R^n to a different space, R^m, so it is reasonable to ask for a natural basis for each of domain and range. The columns of V and U provide these bases. When they are used to represent vectors in the domain and range of the transformation, the nature of the transformation again becomes transparent: it simply dilates some components and contracts others, according to the magnitudes of the singular values, and possibly discards components or appends zeros as needed to account for a change in dimension.

From this perspective, the SVD tells us how to choose orthonormal bases so that the transformation is represented by a matrix with the simplest possible form, that is, diagonal. How do we choose the bases {v1, v2, ..., vn} and {u1, u2, ..., um} for the domain and range? There is no difficulty in obtaining a diagonal representation. For that, we need only Av_i = σ_i u_i, which is easily arranged. Select an orthonormal basis {v1, v2, ..., vn} for R^n so that the first k elements span the row space of A and the remaining n−k elements span the null space of A, where k is the rank of A. Then for 1 ≤ i ≤ k define u_i to be a unit vector parallel to Av_i, and extend this to a basis for R^m. Relative to these bases, A will have a diagonal representation. But in general, although the v's are orthogonal, there is no reason to expect the u's to be.
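The closing caveat can be seen numerically. In this sketch (the matrix A and the rotated basis are my own choices), {v1, v2} is an orthonormal basis of R^2, yet the unit vectors u_i parallel to Av_i fail to be orthogonal:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 1.0]])                     # rank 2, so the row space is all of R^2

theta = 0.4                                    # any rotation angle gives an orthonormal basis
v1 = np.array([np.cos(theta), np.sin(theta)])
v2 = np.array([-np.sin(theta), np.cos(theta)])

# u_i = Av_i / |Av_i|, as in the construction above
u1 = A @ v1 / np.linalg.norm(A @ v1)
u2 = A @ v2 / np.linalg.norm(A @ v2)

print(abs(v1 @ v2) < 1e-12)    # the v's are orthogonal: True
print(abs(u1 @ u2) > 1e-3)     # the u's are NOT orthogonal for this basis: True
```

Only for the special basis constructed in the next paragraph (the eigenvectors of A^T A) do the images stay orthogonal.
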

The possibility of choosing the v-basis so that its orthogonality is preserved under A is the key point. We show next that the EVD of the n×n symmetric matrix A^T A provides just such a basis, namely, the eigenvectors of A^T A.

Let A^T A = VDV^T, with the diagonal entries λ_i of D arranged in nonincreasing order, and let the columns of V (which are eigenvectors of A^T A) be the orthonormal basis {v1, v2, ..., vn}. Then

Av_i · Av_j = (Av_i)^T (Av_j) = v_i^T A^T A v_j = v_i^T (λ_j v_j) = λ_j (v_i · v_j)

so the image set {Av1, Av2, ..., Avn} is orthogonal, and the nonzero vectors in this set form a basis for the range of A. Thus, the eigenvectors of A^T A and their images under A provide orthogonal bases allowing A to be expressed in a diagonal form.

To complete the construction, we normalize the vectors Av_i. The eigenvalues of A^T A again appear in this step. Taking i = j in the calculation above gives |Av_i|^2 = λ_i, which means λ_i ≥ 0. Since these eigenvalues were assumed to be arranged in nonincreasing order, we conclude that λ_1 ≥ λ_2 ≥ ... ≥ λ_k ≥ 0 and, since the rank of A equals k, λ_i = 0 for i > k. The orthonormal basis for the range is therefore defined by

u_i = Av_i / |Av_i| = (1/√λ_i) Av_i,   1 ≤ i ≤ k.

If k < m, we extend this to an orthonormal basis for R^m. This completes the construction of the desired orthonormal bases for R^n and R^m.
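Both facts derived above, that the images Av_i are orthogonal and that |Av_i|^2 = λ_i, can be checked at once: they say precisely that the Gram matrix of the images is diag(λ_i). A sketch (NumPy details and the example matrix are mine):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])               # arbitrary 4x3 example

lams, V = np.linalg.eigh(A.T @ A)             # eigh returns ascending eigenvalues
order = np.argsort(lams)[::-1]                # rearrange into nonincreasing order
lams, V = lams[order], V[:, order]

images = A @ V                                # columns are the images Av_i
G = images.T @ images                         # Gram matrix: entries (Av_i)·(Av_j)

# Off-diagonal entries vanish (orthogonal images); diagonal entries are lambda_i
print(np.allclose(G, np.diag(lams)))          # True
```
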

Setting σ_i = √λ_i, we have Av_i = σ_i u_i for all i ≤ k. Assembling the v_i as the columns of a matrix V and the u_i to form U, this shows that AV = UΣ, where Σ has the same dimensions as A, has the entries σ_i along the main diagonal, and has all other entries equal to zero. Hence, A = UΣV^T, which is the singular value decomposition of A.

In summary, an m×n real matrix A can be expressed as the product UΣV^T, where V and U are orthogonal matrices and Σ is a diagonal matrix, as follows. The matrix V is obtained from the diagonal factorization A^T A = VDV^T, in which the diagonal entries of D appear in nonincreasing order; the columns of U come from normalizing the nonvanishing images under A of the columns of V, and extending (if necessary) to an orthonormal basis for R^m; the nonzero entries of Σ are the respective square roots of corresponding diagonal entries of D.

The preceding construction demonstrates that the SVD exists, and gives some idea of what it tells about a matrix.
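The summary above can be sketched end to end. This is an illustration of the construction under my own NumPy choices (eigh for the EVD of A^T A, and a projection-plus-QR step to extend u_1, ..., u_k to a basis of R^m), not a robust SVD implementation; the next section explains why forming A^T A is a poor idea in practice.

```python
import numpy as np

def svd_via_ata(A, tol=1e-12):
    """Construct U, Sigma, V with A = U Sigma V^T from the EVD of A^T A."""
    m, n = A.shape
    lams, V = np.linalg.eigh(A.T @ A)
    order = np.argsort(lams)[::-1]                   # nonincreasing eigenvalues
    lams, V = lams[order].clip(min=0.0), V[:, order]
    sigmas = np.sqrt(lams)                           # sigma_i = sqrt(lambda_i)
    k = int(np.sum(sigmas > tol))                    # numerical rank of A
    U = np.zeros((m, m))
    U[:, :k] = (A @ V[:, :k]) / sigmas[:k]           # u_i = Av_i / sigma_i
    if k < m:
        # Extend to an orthonormal basis of R^m: project random vectors off
        # span(u_1..u_k), then orthonormalize the remainder with QR
        extra = np.random.default_rng(0).normal(size=(m, m - k))
        extra -= U[:, :k] @ (U[:, :k].T @ extra)
        Q, _ = np.linalg.qr(extra)
        U[:, k:] = Q
    Sigma = np.zeros((m, n))
    r = min(m, n)
    Sigma[:r, :r] = np.diag(sigmas[:r])
    return U, Sigma, V

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0]])                      # 4x3 example of my own
U, Sigma, V = svd_via_ata(A)
print(np.allclose(U @ Sigma @ V.T, A))               # True
print(np.allclose(U.T @ U, np.eye(4)))               # True
```
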

There are a number of additional algebraic and geometric insights about the SVD that are derived with equal ease. Before proceeding to these insights, two remarks should be made. First, the SVD encapsulates the most appropriate bases for the domain and range of the linear transformation defined by the matrix A. There is a beautiful relationship between these bases and the four fundamental subspaces associated with A: the range and nullspace, and their orthogonal complements. It is the full picture provided by the SVD and these subspaces that Strang has termed the fundamental theorem of linear algebra. He also invented a diagram schematically illustrating the relationship of the bases and the four subspaces (see Fig. 1). Strang's article [23] is recommended for a detailed discussion of this topic.

The second remark concerns computation.

There is often a gap between mathematical theory and computational practice. In theory, we envision arithmetic operations being carried out on real numbers in infinite precision. But when we carry out arithmetic on a digital computer we compute not with reals, but with a finite set of rationals, and the results can only approximate with limited precision the real computations.

[Figure 1: Strang's diagram, relating the bases {v1, ..., vk, vk+1, ..., vn} and {u1, ..., uk, uk+1, ..., um} to the four fundamental subspaces: the ranges and null spaces of A and of A^T.]

Procedures that seem elegant and direct in theory can sometimes be dramatically flawed as prescriptions for computational algorithms. The SVD illustrates this point. Our construction appears to offer a straightforward algorithm for the SVD: form A^T A, compute its eigenvalues and eigenvectors, and then find the SVD as described above.
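The hazard can be made concrete with the standard Läuchli example (the NumPy details are mine): forming A^T A rounds 1 + ε² to 1 in double precision, so the small singular value is destroyed, while a direct SVD of A recovers it.

```python
import numpy as np

eps = 1e-10
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])      # exact singular values: sqrt(2 + eps^2) and eps

# Direct SVD of A recovers the tiny singular value accurately
s_direct = np.linalg.svd(A, compute_uv=False)

# Via A^T A: the Gram matrix entry 1 + eps^2 rounds to exactly 1.0,
# so the small eigenvalue (eps^2 = 1e-20) is lost entirely
lams = np.linalg.eigvalsh(A.T @ A)
s_ata = np.sqrt(np.maximum(lams, 0.0))[::-1]

print(s_direct)   # approx [1.4142..., 1e-10]
print(s_ata)      # small singular value lost (not 1e-10)
```
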

Here practice and theory go their separate ways. As we shall see later, the computation of A^T A can be subject to a serious loss of precision. It turns out that direct methods exist for finding the SVD of A without forming A^T A, and indeed in many applications the practical importance of the SVD is that it allows one to find the EVD of A^T A without ever forming this numerically treacherous product. Let us now proceed to several additional aspects of the SVD.

The SVD and the EVD

Our construction showed how to determine the SVD of A from the EVD of the symmetric matrix A^T A. Conversely, it is easy to recover the EVD of A^T A from the SVD of A. For suppose the singular value decomposition A = UΣV^T is given. Clearly, A^T A = VΣ^TΣV^T and AA^T = UΣΣ^TU^T. Now in either order the product of Σ and Σ^T is a square diagonal matrix whose first k diagonal entries are the σ_i², and with any remaining diagonal entries equal to 0.
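The two identities A^T A = VΣ^TΣV^T and AA^T = UΣΣ^TU^T can be verified numerically from a single SVD; this sketch uses an example matrix of my own:

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])                   # m = 2, n = 3, rank k = 2

U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Sigma^T Sigma is n x n: first k diagonal entries sigma_i^2, the rest zero
StS = np.zeros((3, 3))
StS[:2, :2] = np.diag(s**2)

# EVD of A^T A: eigenvectors in V, eigenvalues sigma_i^2 (padded with zeros)
print(np.allclose(Vt.T @ StS @ Vt, A.T @ A))          # True
# In the other order, Sigma Sigma^T (here m x m) gives the EVD of A A^T
print(np.allclose(U @ np.diag(s**2) @ U.T, A @ A.T))  # True
```
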

