IE 539: Convex Optimization, Fall 2023
MW 4:00 - 5:30 pm, E2-2 B105
Instructor: Dabeen Lee (E2-2 #2109)
Office Hours: Tue 2:00 - 3:00 pm
Email: dabeenl [at] kaist [dot] ac [dot] kr
Teaching Assistant: Haeun Jeon (Email: haeun39 [at] kaist [dot] ac [dot] kr)
Text: No required text, but the following materials are helpful.
Recommended textbooks and lecture notes
Syllabus (pdf)
Big data has created many opportunities for better, data-driven decision-making, and many of the associated decision-making problems can be posed as optimization models with special structure such as convexity and smoothness. In this course, graduate students will build a solid foundation in convex optimization, covering both theory (convex analysis, optimality conditions, duality) and algorithms (gradient descent and its variants, Frank-Wolfe, and proximal methods). We will also cover application areas including statistical estimation, finance (e.g., portfolio optimization), machine learning, and data science.
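To give a flavor of the algorithmic side of the course, here is a minimal sketch of gradient descent with a fixed step size applied to a convex quadratic. The objective, step size, and iteration count are illustrative choices only and are not taken from the lecture notes.

```python
# Minimal gradient descent sketch on a convex quadratic f(x) = 0.5 x^T A x - b^T x.
# Illustrative example only: A, b, the step size, and the iteration count are arbitrary choices.
import numpy as np

def gradient_descent(grad, x0, step_size, num_iters):
    """Run fixed-step gradient descent from x0 using the gradient oracle `grad`."""
    x = x0
    for _ in range(num_iters):
        x = x - step_size * grad(x)
    return x

# A is symmetric positive definite, so f is smooth and strongly convex.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
grad_f = lambda x: A @ x - b          # gradient of f(x) = 0.5 x^T A x - b^T x

x_star = np.linalg.solve(A, b)        # exact minimizer, for comparison
x_hat = gradient_descent(grad_f, np.zeros(2), step_size=0.2, num_iters=200)
print(np.allclose(x_hat, x_star, atol=1e-6))  # iterates converge to the minimizer
```

Here the step size is chosen below 2/L, where L is the largest eigenvalue of A, which is the standard condition for fixed-step gradient descent to converge on a smooth convex quadratic.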
Lecture notes
- Mon 8/28: introduction (slides), linear algebra review (note)
- Wed 8/30: matrix calculus review, convex sets (note)
- Mon 9/04: convex functions, first-order and second-order characterizations of convex functions (note)
- Wed 9/06: operations preserving convexity, convex optimization problems I (portfolio optimization, uncertainty quantification) (note)
- Mon 9/11: convex optimization problems II (SVM, LASSO, facility location), classes of convex programming I (LP) (note)
- Wed 9/13: classes of convex programming II (QP, SDP, Conic programming) (note)
- Mon 9/18: conic duality, SOCP and applications (note)
- Wed 9/20: optimality conditions, introduction to gradient descent (note)
- Mon 9/25: convergence of gradient descent (note)
- Wed 9/27: subgradient method, gradient descent for smooth functions (note)
Assignments
- Assignment 1 (pdf)
- Assignment 2 (pdf)
Past versions
Fall 2022