IE 539: Convex Optimization

IE 539: Convex Optimization, Fall 2022

TTh 4:00 - 5:30 pm, E2(산업경영학동) #1501

Instructor: Dabeen Lee (E2-2 #2109)
Office Hours: Wed 2:00 - 4:00 pm
Email: dabeenl [at] kaist [dot] ac [dot] kr

Teaching Assistant: Haeun Jeon (Email: haeun39 [at] kaist [dot] ac [dot] kr)

Text: No required text, but the following materials are helpful.

Recommended textbooks and lecture notes

Syllabus (pdf)

Big data has introduced many opportunities for better decision-making based on a data-driven approach, and many of the relevant decision-making problems can be posed as optimization models with special properties such as convexity and smoothness. In this course, graduate students will build a comprehensive foundation in convex optimization, covering both theory (convex analysis, optimality conditions, duality) and algorithms (gradient descent and its variants, Frank-Wolfe, and proximal methods). We will also cover application areas including statistical estimation, finance (e.g., portfolio optimization), machine learning, and data science.
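As a small taste of the algorithmic side of the course, the sketch below (not taken from the course materials; the quadratic objective and step size are illustrative choices) runs plain gradient descent with a fixed step size on a smooth, strongly convex quadratic f(x) = ½ xᵀAx − bᵀx, whose unique minimizer solves Ax = b:

```python
import numpy as np

def gradient_descent(A, b, x0, step, iters):
    """Minimize f(x) = 0.5 * x^T A x - b^T x by fixed-step gradient descent."""
    x = x0
    for _ in range(iters):
        grad = A @ x - b       # gradient of f at the current iterate
        x = x - step * grad    # fixed step size (must be < 2/L for convergence,
                               # where L is the largest eigenvalue of A)
    return x

A = np.array([[3.0, 0.5],
              [0.5, 2.0]])     # symmetric positive definite, so f is strongly convex
b = np.array([1.0, 1.0])
x = gradient_descent(A, b, np.zeros(2), step=0.2, iters=200)
print(np.allclose(A @ x, b, atol=1e-6))  # iterate (approximately) solves Ax = b
```

For this well-conditioned A, the iterates contract geometrically toward the minimizer, the behavior analyzed in the lectures on smooth and strongly convex functions.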

Announcements

- Classroom change: from #1122 to #1501
- Hybrid class on September 6
- Hybrid class on September 8
- No class on October 4 (make-up day for cancelled classes)
- No class on October 20 (midterm week)
- No class on December 1 (undergraduate entrance exam)
- Take-home final exam on December 13 (due: 11:59 pm on December 14)

Lecture notes

  1. Tue 8/30: Introduction (slides), Review of linear algebra basics I (lecture note)
  2. Thu 9/01: Review of linear algebra basics II, Multivariate calculus review, Convex sets (lecture note)
  3. Tue 9/06: Convex functions, First-order and Second-order characterizations (lecture note)
  4. Thu 9/08: Operations preserving convexity, Optimization terminologies, Convex optimization applications I (Portfolio optimization, Uncertainty quantification, Support vector machine) (lecture note)
  5. Tue 9/13: Convex optimization applications II (LASSO, Facility location), Convex optimization classes I (LP, Convex QP, SDP) (lecture note)
  6. Thu 9/15: Convex optimization classes II (Conic programming, SOCP), Conic duality (lecture note)
  7. Tue 9/20: Optimality conditions for convex minimization, Descent method (lecture note)
  8. Thu 9/22: Introduction to gradient descent, Convergence result for smooth functions (lecture note)
  9. Tue 9/27: Convergence of gradient descent, Subgradients, Optimality conditions for non-differentiable functions, Subgradient method (lecture note)
  10. Thu 9/29: Smoothness, Strong convexity, Convergence of gradient descent for smooth functions (lecture note)
  11. Thu 10/06: More properties of smooth and strongly convex functions, Convergence of gradient descent for smooth functions that are strongly convex (lecture note)
  12. Tue 10/11: Projected gradient descent, Conditional gradient descent (Frank-Wolfe algorithm), Lower bounds on the iteration complexity of gradient methods (lecture note)
  13. Thu 10/13: Accelerated gradient descent (Nesterov's acceleration), Online convex optimization (lecture note)
  14. Tue 10/18: Online binary classification, Stochastic optimization via online convex optimization, Stochastic gradient descent (lecture note)
  15. Tue 10/25: Analysis of stochastic gradient descent, Mini-batch SGD, Variance reduction (lecture note)
  16. Tue 11/01: Variance-reduced stochastic methods, SVRG (lecture note)
  17. Thu 11/03: Proximal operator, Proximal gradient descent (lecture note)
  18. Tue 11/08: KKT conditions, Lagrangian duality (lecture note)
  19. Thu 11/10: Saddle point problem, Fenchel conjugate (lecture note)
  20. Tue 11/15: Fenchel duality (lecture note)
  21. Thu 11/17: Dual gradient ascent, Proximal point algorithm (lecture note)
  22. Tue 11/22: Moreau-Yosida smoothing, Augmented Lagrangian method (lecture note)
  23. Thu 11/24: Dual proximal gradient, ADMM, Introduction to Newton's method (lecture note)
  24. Tue 11/29: Convergence of Newton's method (lecture note)
  25. Tue 12/06: Quasi-Newton method (lecture note)

Assignments

  1. Assignment 1 (pdf)
  2. Assignment 2 (pdf)
  3. Assignment 3 (pdf)
  4. Assignment 4 (pdf)
  5. Assignment 5 (pdf)