Course Outline - CMPT 409 - Spec.Topics/Theoretical Cmpt
Information
Subject: CMPT
Catalog Number: 409
Section: D200
Semester: 2022 Fall (1227)
Title: Spec.Topics/Theoretical Cmpt
Instructor(s): Sharan Vaswani
Campus: Burnaby Mountain Campus
Calendar Objective/Description
Spec.Topics/Theoretical Cmpt
Instructor's Objectives
This course (Optimization for Machine Learning) introduces the foundational concepts of convex and non-convex optimization with applications to machine learning. It will give students experience in (1) proving theoretical guarantees for optimization algorithms, (2) analyzing machine learning (ML) problems from an optimization perspective, and (3) developing and analyzing new optimization methods for ML applications.
Prerequisites
see go.sfu.ca
Topics
- Basics: Subdifferentials, Optimality conditions, Conjugates, Lipschitz continuity, Convexity
- Machine Learning Basics: Linear/Logistic regression, Kernel methods, Deep learning
- (Non)-Convex minimization 1: (Projected/Proximal) Gradient Descent, Nesterov/Polyak momentum
- (Non)-Convex minimization 2: Mirror Descent, Newton/Quasi-Newton/Gauss-Newton method
- (Non)-Convex minimization 3: Stochastic gradient descent (SGD), Variance reduction techniques (see the sketch after this list)
- (Non)-Convex minimization 4: Adaptivity for SGD, Coordinate Descent
- Applications to training ML models (logistic regression, kernel machines, neural networks)
- Online optimization 1: Regret minimization, Online to Batch, Follow the (regularized) leader
- Online optimization 2: Optimistic Gradient Descent, Adaptive gradient methods (AdaGrad, Adam)
- Applications to Imitation learning, Reinforcement learning
- Min-Max optimization 1: Primal-dual methods, (Stochastic) Gradient Descent-Ascent, Proximal point
- Min-Max optimization 2: (Stochastic) Extragradient, Acceleration, Variance reduction
- Applications to GANs, Robust optimization, Multi-agent RL
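To give a flavour of the gradient-descent and SGD topics listed above, here is a minimal sketch (not course-provided code) of full-batch gradient descent and mini-batch SGD for L2-regularized logistic regression on synthetic data. The step sizes follow the standard smoothness-based choices; all variable names, data sizes, and hyperparameters are illustrative assumptions.

```python
# Minimal illustrative sketch: gradient descent vs. SGD for L2-regularized
# logistic regression on synthetic data (NumPy only). All constants are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 1000, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = np.where(X @ w_true + 0.1 * rng.standard_normal(n) > 0, 1.0, -1.0)
lam = 1e-3  # L2 regularization strength (illustrative)

def loss(w):
    # Average logistic loss plus L2 regularization (numerically stable via logaddexp).
    return np.mean(np.logaddexp(0.0, -y * (X @ w))) + 0.5 * lam * (w @ w)

def grad(w, idx=None):
    # Gradient on the full data (idx=None) or on a mini-batch indexed by idx.
    Xi, yi = (X, y) if idx is None else (X[idx], y[idx])
    m = yi * (Xi @ w)
    s = -yi * 0.5 * (1.0 - np.tanh(0.5 * m))  # -y * sigmoid(-y x^T w), stable form
    return Xi.T @ s / len(yi) + lam * w

# Smoothness constant of the regularized logistic loss: L <= ||X||_2^2 / (4n) + lam.
L = np.linalg.norm(X, 2) ** 2 / (4 * n) + lam

# Full-batch gradient descent with the standard constant step size 1/L.
w_gd = np.zeros(d)
for _ in range(200):
    w_gd -= (1.0 / L) * grad(w_gd)

# Mini-batch SGD with a decaying step size proportional to 1/sqrt(t+1).
w_sgd = np.zeros(d)
for t in range(200):
    idx = rng.choice(n, size=32, replace=False)
    w_sgd -= (1.0 / (L * np.sqrt(t + 1))) * grad(w_sgd, idx)

print(f"GD loss:  {loss(w_gd):.4f}")
print(f"SGD loss: {loss(w_sgd):.4f}")
```

The course covers, among other things, why these step-size choices yield convergence guarantees and how variance reduction and adaptive methods improve on plain SGD.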
Grading
There will be a few assignments; the major evaluation components will be a paper presentation and a final project. The details will be discussed in the first week of classes.
Reference Books
- Convex Optimization, Boyd and Vandenberghe, 2004, 9780521833783
- Numerical Optimization, Nocedal and Wright, 2006, 9780387303031
- First-order Methods in Optimization, Beck, 2017, 9781611974980
- Convex Optimization: Algorithms and Complexity, Bubeck, 2014, 9781601988607
- Lectures on Convex Optimization, Nesterov, 2018, 9783319915777
Academic Honesty Statement
Academic honesty plays a key role in our efforts to maintain a high standard of academic excellence and integrity. Students are advised that ALL acts of intellectual dishonesty will be handled in accordance with the SFU Academic Honesty and Student Conduct Policies ( http://www.sfu.ca/policies/gazette/student.html ).