**Instructor**: Dabeen Lee (E2-2 #2109)

**Office Hours**: Wed 2:00 - 4:00 pm

**Email**: dabeenl [at] kaist [dot] ac [dot] kr

**Teaching Assistant**: Haeun Jeon (Email: haeun39 [at] kaist [dot] ac [dot] kr)

**Text**: No required text, but the following materials are helpful.

__Recommended textbooks and lecture notes__

- Convex Optimization by Boyd and Vandenberghe.
- Convex Optimization: Algorithms and Complexity by Bubeck.
- Lectures on Modern Convex Optimization by Ben-Tal and Nemirovski.

**Syllabus** (pdf)

Big data has created many opportunities for better decision-making based on a data-driven approach, and many of the relevant decision-making problems can be posed as optimization models with special properties such as convexity and smoothness. In this course, graduate students will build a comprehensive foundation in convex optimization, covering both theory (convex analysis, optimality conditions, duality) and algorithms (gradient descent and its variants, Frank-Wolfe, and proximal methods). We will also cover application areas including statistical estimation, finance (e.g., portfolio optimization), machine learning, and data science.
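As a small taste of the algorithmic half of the course, here is a minimal sketch of gradient descent on a smooth convex problem, minimizing f(x) = ½‖Ax − b‖² with the standard step size 1/L. The matrix A, vector b, and iteration count are made-up illustrative data, not taken from the course materials.

```python
import numpy as np

# Illustration only: minimize f(x) = 1/2 ||Ax - b||^2 by the iteration
#   x_{k+1} = x_k - (1/L) * grad f(x_k),
# where grad f(x) = A^T (Ax - b) and L is the smoothness constant of f
# (the largest eigenvalue of A^T A).  A and b are random example data.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad(x):
    # gradient of the least-squares objective: A^T (Ax - b)
    return A.T @ (A @ x - b)

L = np.linalg.eigvalsh(A.T @ A).max()  # smoothness constant
x = np.zeros(5)
for _ in range(500):
    x = x - (1.0 / L) * grad(x)

# Compare against the closed-form least-squares minimizer.
x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.linalg.norm(x - x_star))
```

With a fixed step size 1/L, the iterates converge to the minimizer for any smooth convex f; the convergence rates and their proofs are covered in the 9/22 and 9/29 lectures.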

- Classroom change: from #1122 to #1501
- Hybrid class on September 6
- Hybrid class on September 8
- No class on October 4 (make-up day for cancelled classes)
- No class on October 20 (midterm week)
- No class on December 1 (undergraduate entrance exam)
- Take-home final exam on December 13 (due: 11:59 pm on December 14)

- Tue 8/30: Introduction (slides), Review of linear algebra basics I (lecture note)
- Thu 9/01: Review of linear algebra basics II, Multivariate calculus review, Convex sets (lecture note)
- Tue 9/06: Convex functions, First-order and Second-order characterizations (lecture note)
- Thu 9/08: Operations preserving convexity, Optimization terminologies, Convex optimization applications I (Portfolio optimization, Uncertainty quantification, Support vector machine) (lecture note)
- Tue 9/13: Convex optimization applications II (LASSO, Facility location), Convex optimization classes I (LP, Convex QP, SDP) (lecture note)
- Thu 9/15: Convex optimization classes II (Conic programming, SOCP), Conic duality (lecture note)
- Tue 9/20: Optimality conditions for convex minimization, Descent method (lecture note)
- Thu 9/22: Introduction to gradient descent, Convergence result for smooth functions (lecture note)
- Tue 9/27: Convergence of gradient descent, Subgradients, Optimality conditions for non-differentiable functions, Subgradient method (lecture note)
- Thu 9/29: Smoothness, Strong convexity, Convergence of gradient descent for smooth functions (lecture note)
- Thu 10/06: More properties of smooth and strongly convex functions, Convergence of gradient descent for smooth functions that are strongly convex (lecture note)
- Tue 10/11: Projected gradient descent, Conditional gradient descent (Frank-Wolfe algorithm), Lower bounds on the iteration complexity of gradient methods (lecture note)
- Thu 10/13: Accelerated gradient descent (Nesterov's acceleration), Online convex optimization (lecture note)
- Tue 10/18: Online binary classification, Stochastic optimization via online convex optimization, Stochastic gradient descent (lecture note)
- Tue 10/25: Analysis of stochastic gradient descent, Mini-batch SGD, Variance reduction (lecture note)
- Tue 11/01: Variance-reduced stochastic methods, SVRG (lecture note)
- Thu 11/03: Proximal operator, Proximal gradient descent (lecture note)
- Tue 11/08: KKT conditions, Lagrangian duality (lecture note)
- Thu 11/10: Saddle point problem, Fenchel Conjugate (lecture note)
- Tue 11/15: Fenchel duality (lecture note)
- Thu 11/17: Dual gradient ascent, Proximal point algorithm (lecture note)
- Tue 11/22: Moreau-Yosida smoothing, Augmented Lagrangian method (lecture note)
- Thu 11/24: Dual proximal gradient, ADMM, Introduction to Newton's method (lecture note)
- Tue 11/29: Convergence of Newton's method (lecture note)
- Tue 12/06: Quasi-Newton method (lecture note)
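Two topics from the schedule, LASSO (9/13) and proximal gradient descent (11/03), combine naturally into one short sketch: the iterative soft-thresholding algorithm (ISTA), where the prox of the ℓ₁ norm is coordinate-wise shrinkage. The problem data and regularization weight below are made up for illustration and are not from the lecture notes.

```python
import numpy as np

# Illustrative sketch: proximal gradient descent (ISTA) for the LASSO problem
#   minimize (1/2) ||Ax - b||^2 + lam * ||x||_1.
# Each step takes a gradient step on the smooth part, then applies the
# prox of the l1 term (soft-thresholding).  Data and lam are made up.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]           # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(30)
lam = 0.5

def soft_threshold(v, t):
    # prox of t * ||.||_1: shrink each coordinate toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def objective(x):
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()

L = np.linalg.eigvalsh(A.T @ A).max()   # smoothness constant of the quadratic
x = np.zeros(10)
for _ in range(1000):
    x = soft_threshold(x - (1.0 / L) * (A.T @ (A @ x - b)), lam / L)
```

Because the ℓ₁ prox sets small coordinates exactly to zero, the iterates are sparse, which is the point of the LASSO; the convergence analysis of this scheme is part of the 11/03 lecture.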