STCS6701: Probabilistic Models and Machine Learning
Fall 2022
Columbia University

Course information


The course is open to all PhD students at Columbia University. Space permitting, it is also open to master's students and undergraduates. We hope and expect that any qualified student will be able to take the course.

To apply for the course, please join the waiting list, fill out the survey (forthcoming), and come to the first lecture.

Topics and readings

Below are the topics of the class and some readings about each. (These topics and readings are subject to change.)

The readings are at different levels: some are basic and some are advanced. We chose them to provide both fundamental material and other interesting perspectives on each topic; the lectures will not necessarily cover or follow all of this material.

  1. Introduction
  2. The ingredients of probabilistic models
  3. Bayesian mixture models and the Gibbs sampler
  4. Mixed-membership models, topic models, and variational inference
  5. Matrix factorization and efficient MAP inference
  6. Deep generative models and black box variational inference
  7. Exponential families, conjugate priors, and generalized linear models
  8. Hierarchical models, robust models, and empirical Bayes
  9. The theory of graphical models
  10. Advanced topics in variational inference
  11. Model criticism and model diagnosis
  12. An introduction to causality
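The course page does not tie the material to any particular software. As a flavor of topic 3 (Bayesian mixture models and the Gibbs sampler), here is a minimal toy sketch in Python/NumPy, not drawn from the course materials: a Gibbs sampler for a two-component Gaussian mixture with known unit variance and equal mixing weights, alternating between sampling cluster assignments and (conjugate) component means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data from a two-component Gaussian mixture (known variance = 1).
true_means = np.array([-3.0, 3.0])
z_true = rng.integers(0, 2, size=200)
x = rng.normal(true_means[z_true], 1.0)

# Prior: mu_k ~ Normal(0, tau2); mixing weights fixed at 1/2 each.
tau2 = 10.0
K = 2
mu = rng.normal(0.0, 1.0, size=K)  # initialize component means

for it in range(500):
    # 1) Sample assignments z_i given the means (softmax of log-likelihoods).
    logp = -0.5 * (x[:, None] - mu[None, :]) ** 2        # shape (n, K)
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = (rng.random(len(x)) < p[:, 1]).astype(int)       # K = 2 shortcut

    # 2) Sample each mean mu_k given its assigned data (conjugate normal update).
    for k in range(K):
        xk = x[z == k]
        var_post = 1.0 / (len(xk) + 1.0 / tau2)
        mean_post = var_post * xk.sum()
        mu[k] = rng.normal(mean_post, np.sqrt(var_post))

print(np.sort(mu))  # should land near the true means (-3, 3)
```

Each sweep conditions on the current values of the other variables, which is the defining move of the Gibbs sampler; the conjugate normal prior on the means is what makes step 2 a closed-form draw, foreshadowing topic 7 on conjugacy.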