
An introduction to optimization on smooth manifolds

NICOLAS BOUMAL
AN INTRODUCTION TO OPTIMIZATION ON SMOOTH MANIFOLDS
THIS VERSION COMPILED: SEPTEMBER 11, 2020
A much more recent version is available at the following address:

Contents (excerpt):
2.5 Principal component analysis 19
2.6 Synchronization of rotations 22
2.7 Low-rank matrix completion 23
2.8 Gaussian mixture models 24
2.9 Smooth semidefinite programs 25
3 Embedded geometry: first order 27
3.1 Euclidean space 30
3.2 Embedded submanifolds of Euclidean space 33
3.3 Smooth maps on embedded submanifolds 40
3.4 The differential ...


Transcription of An introduction to optimization on smooth manifolds

Contents (chapter headings):
3 Embedded geometry: first order
4 First-order optimization algorithms
5 Embedded geometry: second order
6 Second-order optimization algorithms
7 Embedded submanifolds: examples
8 General manifolds
9 Quotient manifolds
10 Additional tools
11 Geodesic convexity
Bibliography

Preface

This book grew out of the lecture notes for MAT/APC 588, Topics in Numerical Analysis: optimization on smooth manifolds, taught in the spring semesters of 2019 and 2020 at Princeton University. Over the last few decades, optimization on manifolds has garnered increasing interest in research and engineering, recognized as a wide, beautiful and effective generalization of unconstrained optimization. This ascension was accelerated by the publication of the de facto reference book on the matter, Optimization Algorithms on Matrix Manifolds by Pierre-Antoine Absil, Robert Mahony and Rodolphe Sepulchre (Princeton University Press, Princeton, NJ, 2008).

Optimization problems on smooth manifolds arise often in engineering applications, including in machine learning, computer vision, signal processing, dynamical systems and scientific computing. Yet, engineering programs seldom include training in differential geometry. Furthermore, existing textbooks on this topic usually have a focus more aligned with the interests of mathematicians than with the needs of engineers and applied mathematicians. One of my goals in writing this book is to offer a different, if at times unorthodox, introduction to differential geometry.

Definitions and tools are introduced in a need-based order for optimization. We start with a restricted setting, that of embedded submanifolds of linear spaces, which allows us to define all necessary concepts in direct reference to their usual counterparts from linear spaces. This covers most applications of interest. What is perhaps the clearest departure from standard exposition: charts and atlases are not introduced until quite late. The reason for doing so is twofold: pedagogically, charts and atlases are more abstract than what is needed to work on embedded submanifolds; and, pragmatically, charts are seldom if ever useful in practice.

It would be unfortunate to give them center stage. Of course, charts and atlases are the right tool to provide a unified treatment of all smooth manifolds in an intrinsic way. They are introduced eventually, at which point it becomes possible to discuss quotient manifolds: a powerful language to understand symmetry in optimization. Perhaps this abstraction is necessary to fully appreciate the depth of optimization on manifolds as more than just a fancy tool for constrained optimization in linear spaces, and truly a mathematically natural setting for unconstrained optimization in a wider class of spaces. Readers are assumed to be comfortable with linear algebra and multivariable calculus.

For important computational aspects, it is helpful to have notions of numerical linear algebra, for which I recommend the approachable textbook by Trefethen and Bau (L. N. Trefethen and D. Bau. Numerical Linear Algebra. Society for Industrial and Applied Mathematics, 1997). True to the raison d'être of this book, there are no prerequisites in differential geometry or optimization. Based on these expectations, the aim is to give full proofs and justification for all concepts that are introduced, at least for the case of submanifolds of linear spaces. The hope is to equip readers to pursue research projects in (or using) optimization on manifolds, involving both mathematical analysis and efficient implementations. While the exposition is original in several ways, and some results are hard to locate in the literature, few results are new.

The main differential geometry references I used are the fantastic books by Lee, O'Neill, and Brickell and Clark:

J. M. Lee. Introduction to Smooth Manifolds, volume 218 of Graduate Texts in Mathematics. Springer-Verlag New York, 2nd edition, 2012.
J. M. Lee. Introduction to Riemannian Manifolds, volume 176 of Graduate Texts in Mathematics. Springer, 2nd edition, 2018.
B. O'Neill. Semi-Riemannian Geometry: With Applications to Relativity. Academic Press, 1983.
F. Brickell and R. S. Clark. Differentiable Manifolds: An Introduction. Van Nostrand Reinhold, 1970.

A number of people offered decisive comments. I thank Pierre-Antoine Absil (who taught me most of this) and Rodolphe Sepulchre for their input at the early stages of planning for this book, as well as (in no particular order) Stephen McKeown, Eitan Levin, Chris Criscitiello, Razvan-Octavian Radu, Joe Kileel, Bart Vandereycken, Bamdev Mishra, Suvrit Sra and Sándor Z. Németh for numerous conversations that led to direct improvements. I am also indebted to the mathematics department at Princeton University for supporting me while I was writing. On a distinct note, I also thank Bil Kleb, Bill Wood and Kevin Godby for developing and freely distributing tufte-latex: the style files used for this book.

This book is still evolving. Your input on all aspects, however major or minor, is immensely welcome. Please feel free to contact me.

Nicolas Boumal
Princeton, New Jersey

1 Introduction

Optimization is a staple of mathematical modeling. In this rich framework, we consider a set S, called the search space, which contains all possible answers to our problem, good and bad, and a cost function f : S → R which associates a cost f(x) to each element x of S.

The goal is to find x ∈ S such that f(x) is as small as possible: a best answer. We write

min_{x ∈ S} f(x)

to represent both the optimization problem and the minimal cost (if it exists). Occasionally, we wish to denote specifically the subset of S for which the minimal cost is attained; the standard notation is

arg min_{x ∈ S} f(x),

bearing in mind that this set might be empty. In Chapter 2, we discuss a few simple examples relevant to our purposes. Rarely do optimization problems admit an analytical solution. Typically, we need numerical algorithms to (try to) solve them. Often, the best algorithms exploit mathematical structure in the problem. An important special case arises when S is a linear space such as R^n.
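As a minimal concrete illustration of min and arg min, here is a sketch for the special case of a finite search space; the set S and cost f below are hypothetical choices for illustration only, not taken from the book:

```python
# Sketch of the abstract problem min_{x in S} f(x) for a *finite* search
# space S, where exhaustive evaluation is possible. Both S and f here are
# illustrative, hypothetical choices.

def argmin_set(S, f):
    """Return (minimal cost, set of minimizers) of f over a finite set S."""
    values = {x: f(x) for x in S}       # evaluate the cost at every candidate
    best = min(values.values())         # the minimal cost min_{x in S} f(x)
    minimizers = {x for x, v in values.items() if v == best}  # arg min set
    return best, minimizers

S = {-2, -1, 0, 1, 2}
f = lambda x: (x * x - 1) ** 2          # minimized exactly where x^2 = 1

best, minimizers = argmin_set(S, f)
print(best)                # 0
print(sorted(minimizers))  # [-1, 1]
```

Note that the arg min here is a set with two elements: a minimizer need not be unique, and over an infinite S (e.g., an open interval) the set can even be empty, which is why numerical algorithms usually settle for approximate minimizers.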

