This book, developed through class instruction at MIT over the last 15 years, provides an accessible, concise, and intuitive presentation of algorithms for solving convex optimization problems. It relies on rigorous mathematical analysis, but also aims at an intuitive exposition that makes use of visualization where possible. This is facilitated by the extensive use of analytical and algorithmic concepts of duality, which by nature lend themselves to geometrical interpretation. The book places particular emphasis on modern developments and their widespread applications in fields such as large-scale resource allocation, signal processing, and machine learning.
Among its features, the book:
* Develops comprehensively the theory of descent and approximation methods, including gradient and subgradient projection methods, cutting plane and simplicial decomposition methods, and proximal methods
* Describes and analyzes augmented Lagrangian methods and the alternating direction method of multipliers
* Develops the modern theory of coordinate descent methods, including distributed asynchronous convergence analysis
* Includes optimal algorithms based on extrapolation techniques, with associated convergence rate analysis
* Describes a broad variety of applications in large-scale optimization and machine learning
* Contains many examples, illustrations, and exercises
* Is structured for convenient use either as a standalone text for a class on convex analysis and optimization, or as a theoretical supplement to a class on applications/convex optimization models or on nonlinear programming