**Instructor**: Dabeen Lee (E2-2 #2109)

**Office Hours**: Tue 2:00 - 3:00 pm

**Email**: dabeenl [at] kaist [dot] ac [dot] kr

**Teaching Assistant**: Haeun Jeon (Email: haeun39 [at] kaist [dot] ac [dot] kr)

**Text**: No required text, but the following materials are helpful.

__Recommended textbooks and lecture notes__

- Convex Optimization by Boyd and Vandenberghe.
- Convex Optimization: Algorithms and Complexity by Bubeck.
- Lectures on Modern Convex Optimization by Ben-Tal and Nemirovski.

**Syllabus** (pdf)

Big data has created many opportunities for better, data-driven decision-making, and many of the relevant decision-making problems can be posed as optimization models with special properties such as convexity and smoothness. In this course, graduate students will build a comprehensive foundation in convex optimization, covering both theory (convex analysis, optimality conditions, duality) and algorithms (gradient descent and its variants, Frank-Wolfe, and proximal methods). We will also cover application areas including statistical estimation, finance (e.g., portfolio optimization), machine learning, and data science.
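As a small taste of the algorithmic side of the course, fixed-step gradient descent on a smooth, strongly convex quadratic can be sketched as below. This is only an illustration: the matrix `A`, vector `b`, step size, and iteration count are made-up choices, not taken from the course materials.

```python
import numpy as np

def gradient_descent(grad, x0, step, iters):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Illustrative quadratic f(x) = 0.5 * x^T A x - b^T x with A positive definite,
# so the unique minimizer solves A x = b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

grad = lambda x: A @ x - b          # gradient of f
x_star = np.linalg.solve(A, b)      # exact minimizer, for comparison

x = gradient_descent(grad, np.zeros(2), step=0.2, iters=200)
print(np.allclose(x, x_star, atol=1e-6))  # True: step < 2/L here
```

For smooth functions, any constant step size below 2/L (with L the largest eigenvalue of `A` in this example) guarantees convergence; the course lectures on convergence of gradient descent make this precise.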

- Mon 8/28: introduction (slides), linear algebra review (note)
- Wed 8/30: matrix calculus review, convex sets (note)
- Mon 9/04: convex functions, first-order and second-order characterizations of convex functions (note)
- Wed 9/06: operations preserving convexity, convex optimization problems I (Portfolio optimization, Uncertainty quantification) (note)
- Mon 9/11: convex optimization problems II (SVM, LASSO, Facility location), classes of convex programming I (LP) (note)
- Wed 9/13: classes of convex programming II (QP, SDP, Conic programming) (note)
- Mon 9/18: conic duality, SOCP and applications (note)
- Wed 9/20: optimality conditions, introduction to gradient descent (note)
- Mon 9/25: convergence of gradient descent (note)
- Wed 9/27: subgradient method, gradient descent for smooth functions (note)
- Thu 10/05: convergence of gradient descent for functions that are smooth and strongly convex (note)
- Wed 10/11: projected gradient descent, Nesterov's acceleration, oracle complexity lower bounds (note)
- Mon 10/23: Frank-Wolfe algorithm, introduction to online convex optimization (note)
- Wed 10/25: online and stochastic gradient descent algorithms (note)
- Mon 10/30: convergence of stochastic gradient descent, proximal gradient descent (note)
- Wed 11/01: convergence of proximal gradient descent, proximal point algorithm (note)
- Mon 11/06: KKT conditions, Lagrangian duality (note)
- Wed 11/08: saddle point problem, Fenchel conjugate (note)
- Mon 11/13: Fenchel duality (note)
- Wed 11/15: dual gradient method, Moreau-Yosida smoothing (note)
- Mon 11/20: augmented Lagrangian method, alternating direction method of multipliers (ADMM) (note)
- Wed 11/22: Newton's method (note)
- Mon 11/27: quasi-Newton methods (note)
- Mon 12/04: barrier method (note)
- Wed 12/06: primal-dual interior point method (note)
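As an illustration of the proximal methods listed in the schedule, here is a minimal sketch of proximal gradient descent (ISTA) applied to a tiny LASSO problem. The data, regularization weight, and step size below are illustrative assumptions, not examples from the lectures.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, step, iters):
    """Proximal gradient descent for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)                     # gradient of the smooth part
        x = soft_threshold(x - step * g, step * lam)
    return x

# Illustrative separable problem; step = 1/L with L = lambda_max(A^T A) = 4.
A = np.array([[1.0, 0.0],
              [0.0, 2.0]])
b = np.array([1.0, 2.0])

x = ista(A, b, lam=0.1, step=0.25, iters=500)
print(x)  # approximately [0.9, 0.975]
```

Because the objective here is separable, the minimizer can be checked by hand coordinate-wise, which is a convenient sanity test for the implementation.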