IE 539: Convex Optimization

IE 539: Convex Optimization, Fall 2023

MW 4:00 - 5:30 pm, E2-2 B105

Instructor: Dabeen Lee (E2-2 #2109)
Office Hours: Tue 2:00 - 3:00 pm
Email: dabeenl [at] kaist [dot] ac [dot] kr

Teaching Assistant: Haeun Jeon (Email: haeun39 [at] kaist [dot] ac [dot] kr)

Text: No required text, but the following materials are helpful.

Recommended textbooks and lecture notes

Syllabus (pdf)

Big data has created many opportunities for better, data-driven decision-making, and many of the relevant decision-making problems can be posed as optimization models with special properties such as convexity and smoothness. In this course, graduate students will build a solid foundation in convex optimization, covering both theory (convex analysis, optimality conditions, duality) and algorithms (gradient descent and its variants, Frank-Wolfe, and proximal methods). We will also cover application areas including statistical estimation, finance (e.g., portfolio optimization), machine learning, and data science.
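
As a small illustration of the algorithmic side of the course (not part of the course materials), the following is a minimal sketch of gradient descent with a fixed step size 1/L on a least-squares objective; the matrix A and vector b are made-up example data, and the stopping rule is simply a fixed iteration count.

import numpy as np

def gradient_descent(A, b, num_iters=200):
    # Minimize f(x) = 0.5 * ||A x - b||^2, whose gradient is A^T (A x - b).
    L = np.linalg.norm(A, 2) ** 2        # smoothness constant: largest eigenvalue of A^T A
    x = np.zeros(A.shape[1])             # start from the origin
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)         # gradient of the objective at the current iterate
        x = x - grad / L                 # fixed step size 1/L
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))     # hypothetical example data
    b = rng.standard_normal(20)
    x_hat = gradient_descent(A, b)
    # Compare against the closed-form least-squares solution.
    print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0], atol=1e-4))
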

Lecture notes

  1. Mon 8/28: introduction (slides), linear algebra review (note)
  2. Wed 8/30: matrix calculus review, convex sets (note)
  3. Mon 9/04: convex functions, first-order and second-order characterizations of convex functions (note)
  4. Wed 9/06: operations preserving convexity, convex optimization problems I (Portfolio optimization, Uncertainty quantification) (note)
  5. Mon 9/11: convex optimization problems II (SVM, LASSO, Facility location), classes of convex programming I (LP) (note)
  6. Wed 9/13: classes of convex programming II (QP, SDP, Conic programming) (note)
  7. Mon 9/18: conic duality, SOCP and applications (note)
  8. Wed 9/20: optimality conditions, introduction to gradient descent (note)
  9. Mon 9/25: convergence of gradient descent (note)
  10. Wed 9/27: subgradient method, gradient descent for smooth functions (note)

Assignments

  1. Assignment 1 (pdf)
  2. Assignment 2 (pdf)

Past versions

Fall 2022