IE 539: Convex Optimization, Fall 2024

MW 2:30 - 3:40 pm, E2-2 B105

Instructor: Dabeen Lee (E2-2 #2109)
Office Hours: Tue 2:00 - 3:00 pm
Email: dabeenl [at] kaist [dot] ac [dot] kr

Teaching Assistants: Jaehyun Park (Email: jhpark [at] kaist [dot] ac [dot] kr) and Junyeop Kwon (Email: junyeopk [at] kaist [dot] ac [dot] kr)

Text: No required text, but the following materials are helpful.

Recommended textbooks and lecture notes

Syllabus (pdf)

Big data has created many opportunities for better, data-driven decision-making, and many of the relevant decision-making problems can be posed as optimization models with special properties such as convexity and smoothness. In this course, graduate students will build a rigorous and comprehensive foundation in convex optimization, covering both theory (convex analysis, optimality conditions, duality) and algorithms (gradient descent and its variants, Frank-Wolfe, and proximal methods). We will also cover application areas including statistical estimation, finance (e.g., portfolio optimization), machine learning, and data science.
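As a small taste of the algorithmic half of the course, here is an illustrative sketch (not part of the official course materials) of fixed-step gradient descent on a smooth, strongly convex quadratic; the step size and matrix below are chosen only for demonstration.

```python
# Illustrative sketch: gradient descent on the strongly convex quadratic
# f(x) = 0.5 * x^T A x - b^T x, whose unique minimizer solves A x = b.
import numpy as np

def gradient_descent(A, b, step, iters):
    """Minimize 0.5 x^T A x - b^T x with a fixed step size."""
    x = np.zeros_like(b)
    for _ in range(iters):
        grad = A @ x - b       # gradient of the quadratic
        x = x - step * grad    # descent step
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # positive definite
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)          # exact minimizer
x_hat = gradient_descent(A, b, step=0.4, iters=200)
print(np.allclose(x_hat, x_star, atol=1e-8))  # prints True
```

With a step size below 2/L (L being the largest eigenvalue of A, about 3.62 here), the iterates converge linearly to the minimizer, which is exactly the kind of guarantee analyzed in the lectures on convergence of gradient descent.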

Lecture notes

  1. Mon 9/02: introduction (slides), linear algebra review (note)
  2. Wed 9/04: matrix calculus review, convex sets (note)
  3. Mon 9/09: convex functions, first-order and second-order characterizations of convex functions (note)
  4. Wed 9/11: operations preserving convexity, convex optimization problems I (Portfolio optimization, Uncertainty quantification) (note)
  5. Mon 9/23: convex optimization problems II (SVM, LASSO, Facility location), classes of convex programming I (LP) (note)
  6. Wed 9/25: classes of convex programming II (QP, SDP, Conic programming) (note)
  7. Mon 9/30: conic duality, SOCP and applications (note)
  8. Wed 10/02: optimality conditions, introduction to gradient descent (note)
  9. Mon 10/07: convergence of gradient descent (note)
  10. Mon 10/14: subgradient method, gradient descent for smooth functions, projected gradient descent (note)
  11. Wed 10/16: convergence of gradient descent for functions that are smooth and strongly convex (note)
  12. Mon 10/28: oracle complexity lower bounds, Nesterov's acceleration, Frank-Wolfe algorithm (note)
  13. Wed 10/30: stochastic gradient descent (note)
  14. Mon 11/04: proximal gradient descent (note)
  15. Wed 11/06: ISTA and FISTA for LASSO (note)
  16. Mon 11/11: proximal point algorithm, Lagrangian duality (note)
  17. Wed 11/13: saddle point problem, Fenchel duality I (note)
  18. Mon 11/18: Fenchel duality II (note)
  19. Wed 11/20: dual gradient method, Moreau-Yosida smoothing (note)
  20. Mon 11/25: augmented Lagrangian method, alternating direction method of multipliers (ADMM) (note)
  21. Wed 11/27: Newton's method (note)
  22. Mon 12/02: quasi-Newton methods (note)
  23. Wed 12/04: barrier method (note)
  24. Mon 12/09: primal-dual interior point method (note)
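To preview the proximal-method lectures (in particular ISTA for LASSO), the following is a minimal, self-contained sketch, not taken from the course notes; the problem data, regularization level, and iteration count are arbitrary choices for illustration.

```python
# Minimal ISTA sketch for LASSO: min_x 0.5*||Ax - b||^2 + lam*||x||_1
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=500):
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L with L = ||A||_2^2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]               # sparse ground truth
b = A @ x_true                              # noiseless measurements
x_hat = ista(A, b, lam=0.1)
```

Each iteration is a gradient step on the smooth least-squares term followed by the proximal map of the l1 term; FISTA (also covered in the lectures) adds Nesterov-style momentum to this loop.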

Assignments

  1. Assignment 1 (pdf)
  2. Assignment 2 (pdf)
  3. Assignment 3 (pdf)
  4. Assignment 4 (pdf)
  5. Assignment 5 (pdf)

Final exam

(file)

Past versions

Fall 2023

Fall 2022