CPSC 536M: Optimization Theory
The Convex Geometry of Structured Optimization

Convex optimization is a key tool for analyzing and solving a range of computational problems that arise in machine learning, statistical signal and image processing, theoretical computer science, and other fields. It is also the backbone of other areas of optimization, including algorithms for nonconvex problems. This course aims to provide a self-contained introduction to a few of the many geometric and intuitive ideas in convex analysis, and to their usefulness for developing and understanding computationally efficient algorithms for a range of scientific and engineering applications.

In addition to convexity, a recurring theme in the course is the convex geometry of structured optimization, which concerns problems whose solutions exhibit a latent structure and are assembled from a prescribed set of atoms. As the optimization and related communities have recognized for at least the past decade, exploiting this latent structure, together with the overarching structure provided by convexity, yields significant computational efficiencies.
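To make the notion of atoms concrete, a standard construction (not part of this course page itself) is the gauge, or atomic norm, induced by an atomic set:

```latex
% Gauge (atomic norm) induced by a set of atoms \mathcal{A}:
\[
  \|x\|_{\mathcal{A}}
  \;=\;
  \inf\bigl\{\, t \ge 0 \;:\; x \in t \cdot \operatorname{conv}(\mathcal{A}) \,\bigr\}.
\]
% Two standard instances:
%  - \mathcal{A} = \{\pm e_1, \dots, \pm e_n\} (signed coordinate vectors)
%    gives the \ell_1 norm, whose minimization promotes sparse solutions;
%  - \mathcal{A} = \{\, uv^{\mathsf T} : \|u\|_2 = \|v\|_2 = 1 \,\}
%    (rank-one matrices) gives the nuclear norm, whose minimization
%    promotes low-rank solutions.
```

Minimizing such a gauge over a feasible set favors solutions expressible as short combinations of atoms, which is the latent structure alluded to above.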
The course will emphasize the rich convex geometry that underlies atomic decomposition and its algorithmic manifestations.

Syllabus

Here is a broad overview of the topics covered.

Part 1: Convex sets
- convexity-preserving operations, generated sets, atomic sets
- projection and separation of sets
- cones and polarity

Part 2: Convex functions
- epigraphical view
- convexity-preserving operations, infimal convolution, perspective transform, smoothing
- differentiable and nondifferentiable functions
- support functions: correspondence with convex sets; norms and duals; gauges and polars
- dual operations

Part 3: Convex optimization
- problem types: linear and quadratic optimization, conic programming
- applications of structured optimization, including compressed sensing, matrix completion, and superresolution
- Lagrange and conjugate duality
- first-order methods for unconstrained and nonsmooth problems, including subgradient, proximal gradient, and conditional gradient methods

Target Audience

This course is intended for students who wish to learn the underpinnings of optimization and are considering research related to this area. Students looking to gain more practical experience with optimization (e.g., how to use various solvers) should instead consider CPSC 406 (Computational Optimization), taught during Term 2.

Prerequisites

Background in vector calculus, numerical linear algebra, and basic real analysis. The course is unapologetically mathematical, and a casual background in these topics will not be sufficient. Consider taking this online self-assessment on numerical linear algebra.

Grading

Homework assignments. Several assignments involving both theoretical and computational exercises.

Course project. A project will be assigned that requires deriving, implementing, and applying an optimization algorithm to solve a problem numerically. A short report (4-6 pages) will be required and will be evaluated for mathematical correctness, quality of the software implementation, and quality of the presentation.
This will be due in the last week of term.

Grade distribution (tentative). 50% homework; 50% course project.

Auditors and Undergraduates

Auditors are welcome. Graduate students who wish to audit should file a graduate registration form with the CS front office. Undergraduate students who wish to take the course for credit should request permission from the instructor and file an undergraduate registration form with the CS front office.

References

These references should be helpful supplements to the course notes:

- Bertsekas, D. P., Convex Optimization Theory. Athena Scientific, 2009.
- Boyd, S., and Vandenberghe, L., Convex Optimization. Cambridge University Press, 2004.
- Hiriart-Urruty, J.-B., and Lemaréchal, C., Fundamentals of Convex Analysis. Springer, 2012.
- Rockafellar, R. T., Convex Analysis. Princeton University Press, 2015.
- Rockafellar, R. T., and Wets, R. J.-B., Variational Analysis. Springer, 2009.

Piazza

Sign up for the Piazza course page.