Some things to get started

Syllabus, Spring 2023

Topics covered

01/31 #1: Introduction (slides on Canvas)
02/02 #2: Math review [AAA (Lec 2 notes), BV (Appendix A)]
02/07 #3: Optimality conditions for unconstrained optimization [notes, AAA (Lec 3), CZ (Chapter 6)]
02/09 #4: AM-GM inequality proof (as an example); the convex optimization problem [AAA (Lec 3, 4)]
02/14 #5: Convex sets, their properties and examples [AAA (Lec 4, 5), BV (Chapter 2)]
02/16 #6: Convex hulls and Carathéodory theorems (standard and approximate) [AAA (Lec 4), V (Chapter 1.3), V-HDP (Intro)]
02/21 #7: Convex functions, their properties, and convex envelopes [see next class]
02/23 #8: Optimality of convex optimization [BV (Chapter 3), AAA (Lec 7, 8)]
02/28 #9: Separation of convex sets, Farkas' lemma, and strong duality of linear programming [AAA (Lec 5), BV (Chapter 2)]
03/02 #10: Support vector machines [svm, BV (8.6), AAA (Lec 8)]
03/07 #11: König's theorem (total unimodularity and tightness of the linear relaxation) [AAA (Lec 6)]
03/09: Midterm in class
03/21 #12: Classes of optimization problems, from LP to SDP [AAA (Lec 9), AG, BV (Ch 4), LV (Ch 2)]
03/23 #13: SDP problems, standard form and duality; CP duality [AAA (Lec 9), LV (Ch 2)]
03/28 #14: Applications of SDP in eigenvalue optimization, including system stability [AAA (Lec 10), LV (Ch 2)]
03/30 #15: Robust optimization and the S-lemma [robust optimization note, AAA (Lec 16, 12), BV (Ch 4), BBC]
04/04 #16: S-lemma proof [AAA (Lec 16), BV (Appendix B2), LV]
04/06 #17: Lovász SDP relaxation for the graph stability number and Shannon capacity [AAA (Lec 11), LV (Ch 6)]
04/11 #18: Graph MaxCut SDP relaxation and Grothendieck's inequality [V (Ch 3), LV (Ch 7)]
04/13 #19: Goemans-Williamson algorithm, cut norm estimation, SDP for probability estimates [V (Ch 3), LV (Ch 7), KN]
04/18 #20: SDP for probability estimates, complexity theory [computational complexity note, KT (8.1), AAA (Lec 13)]
04/20 #21: Complexity theory [see above]
04/25 #22: Hardness of optimization problems [AAA (Lec 14, 15)]
04/27 #23: Discussion [class note, AAA (Lec 15), M (Ch 4)]
Additionally, review the theory developed in the homework assignments, including optimality of the singular value decomposition, quasi-convex and strictly/strongly convex functions, and extreme points (also in Lecture 11). After the midterm: various forms of SOCP, criteria for stability, tightness of the Lovász SDP approximation for the stability number, relations between decision and search problems, and approximating hard problems in randomized time.

Main textbooks and links:

[BV] S. Boyd and L. Vandenberghe, Convex Optimization
[LV] M. Laurent and F. Vallentin, Semidefinite Optimization
[V-HDP] R. Vershynin, High-Dimensional Probability: An Introduction with Applications in Data Science
[V] R. Vershynin, Lectures in Geometric Functional Analysis
[CZ] E. Chong and S. Zak, An Introduction to Optimization
[AAA] Typed lecture notes covering the majority of this class material (instructor A. A. Ahmadi, scribe G. Hall)
[AG] F. Alizadeh and D. Goldfarb, Second-Order Cone Programming
[KN] S. Khot and A. Naor, Grothendieck-Type Inequalities in Combinatorial Optimization
[BBC] D. Bertsimas, D. Brown, and C. Caramanis, Theory and Applications of Robust Optimization
[KT] J. Kleinberg and E. Tardos, Algorithm Design
[M] M. Mahoney, Randomized Algorithms for Matrices and Data
[PW] M. Pilanci and M. Wainwright, Randomized Sketches of Convex Programs with Sharp Guarantees
[BF] A. Bluhm and D. S. Franca, Dimensionality Reduction of SDPs through Sketching

Software

You can use Matlab or Python for the homework assignments. You may also use other common programming languages; please reach out to me or the AI before doing so. Some useful links:
1. Python-based modeling software CVXPY to solve optimization problems (see the minimal sketch after this list)
2. Matlab-based modeling software CVX to solve optimization problems
3. Download Matlab with a Princeton license
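
For illustration only, here is a minimal CVXPY sketch; it is not part of any assignment. It assumes the cvxpy and numpy packages are installed, and the data and problem (a small made-up nonnegative least-squares instance) are placeholders.

import numpy as np
import cvxpy as cp

# Illustrative data only: a random 20 x 5 least-squares instance.
np.random.seed(0)
A = np.random.randn(20, 5)
b = np.random.randn(20)

x = cp.Variable(5)                              # decision variable
objective = cp.Minimize(cp.norm(A @ x - b, 2))  # minimize ||Ax - b||_2
constraints = [x >= 0]                          # elementwise nonnegativity
problem = cp.Problem(objective, constraints)
problem.solve()                                 # CVXPY selects a default solver

print("optimal value:", problem.value)
print("optimal x:", x.value)

CVX in Matlab follows the same modeling pattern: declare variables, state an objective and constraints, and let the package hand the problem to a solver.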