COMS 4773: Machine Learning Theory (Spring 2024)
COMS 4773 is a graduate-level introduction to machine learning theory.
Schedule
- Week 1 - January 16: Overview, Using expert advice
- Week 1 - January 18: Using expert advice
- Week 2 - January 23: Decision-theoretic online learning
- Week 2 - January 25: Decision-theoretic online learning (continued)
- Week 3 - January 30: Multi-armed bandits
- Week 3 - February 1: Online convex optimization
- Week 4 - February 6: Online convex optimization
- Week 4 - February 8: Probability inequalities
- Week 5 - February 13: Statistical learning
  - Reading: UML Chapters 2-3
  - Optional reading: Zhang, 2023, Sections 3.1-3.2
- Week 5 - February 15: Statistical learning
- Week 6 - February 20: Uniform convergence [no in-person lecture]
- Week 6 - February 22: Uniform convergence
  - Reading: Uniform convergence handout (above)
- Week 7 - February 27: No lecture (Daniel traveling)
- Week 7 - February 29: No lecture (Daniel traveling)
- Week 8 - March 5: Boosting
- Week 8 - March 7: Boosting
- Week 9 - March 19: Hardness of empirical risk minimization
- Week 9 - March 21: Surrogate losses
  - Reading: UML Chapter 12 (12.2.1 can just be skimmed)
  - Reading: Schapire and Freund, 2012, Sections 7.1-7.2
  - Reading: Boosting handout (above), Section 6
- Week 10 - March 26: Convex optimization
  - Reading: Convex optimization handout (above)
- Week 10 - March 28: Convex optimization
  - Optional reading: UML Chapter 14
- Week 11 - April 2: Approximation theory and kernel methods
  - Reading: UML Chapter 16
  - Optional reading: UML Chapter 20
- Week 11 - April 4: Approximation theory and kernel methods
- Week 12 - April 9: Fourier methods
- Week 12 - April 11: Fourier methods
  - Reading: same as previous lecture
- Week 13 - April 16: Minimax lower bounds
- Week 13 - April 18: Minimax lower bounds
  - Reading: same as previous lecture
- Week 14 - April 23: Minimax lower bounds
  - Reading: same as previous lecture
- Week 14 - April 25: Johnson-Lindenstrauss lemma