Course-related notes
Link to Tony Jebara's course from the previous semester: home page
Regression by linear combination of basis functions: [pdf]
Multi-dimensional regression: [pdf]
The perceptron: [pdf]
Lagrange Multipliers: [pdf]
Document classification with the multinomial model: [pdf]
Sampling from a Gaussian: [pdf]
Slides on exponential family distributions: [pdf]
Resources on conditional independence and graphical models (various levels and styles, so there should be something here for everyone; the first three books also contain more advanced material for the interested):
Bishop PRML book, Chapter 8
Jordan & Bishop book, (in rough order of our class) Chapters 2, 8, 3, 16, 17
MacKay book, Chapter 2
StackExchange reply to "Could someone explain conditional independence?" link
Notes by Kathy McKeown [txt], related slides from her AI course [ppt] and [ppt]
Kevin Murphy's 1998 tutorial
Talk by Zoubin Ghahramani
coursera video lectures
Graphical Models in a Nutshell, Koller et al
David Sontag's 2012 course at NYU (more advanced than ours) link
A few more:
Bishop Chapter 8 on graphical models is available here
Mark Paskin's short course on graphical models is here
An introduction by Martin Wainwright is here
A related course from 2004 by Zoubin Ghahramani is here (includes HMMs, conditional independence, and many other topics)
Shachter's Bayes Ball paper is here
Proofs of the junction tree theorems (if the graph is triangulated, the maximum-weight spanning tree of cliques satisfies the running intersection property) from Marina Meila
Writing in LaTeX
A sample document together with its source and a figure [pdf] [tex] [figure ps].
In a UNIX/Linux environment use
"The Comprehensive LaTeX symbol list" by Scott Pakin [pdf]
Linear Algebra Review
A quick reference for basic operations on vectors and matrices,
with the corresponding MATLAB commands indicated [pdf].
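For readers working in Python rather than MATLAB, the same basic operations have direct NumPy counterparts. A small sketch (this is my own illustration, not part of the linked reference; MATLAB equivalents are shown in the comments):

```python
import numpy as np

# Basic vector/matrix operations, with MATLAB equivalents in comments
A = np.array([[1.0, 2.0], [3.0, 4.0]])    # A = [1 2; 3 4]
x = np.array([1.0, -1.0])                 # x = [1; -1]

y = A @ x                  # y = A*x       (matrix-vector product)
At = A.T                   # At = A'       (transpose)
Ainv = np.linalg.inv(A)    # Ainv = inv(A)
d = np.dot(x, x)           # d = x'*x      (inner product)
tr = np.trace(A)           # tr = trace(A)
detA = np.linalg.det(A)    # detA = det(A)
```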
Support Vector Machines Tutorial
Support vector machines are covered in neither Bishop's textbook nor Jordan's textbook.
SVMs are a newer research topic that only really emerged in the late 1990s.
To find out more about SVMs, read the following popular tutorial by Chris Burges:
A Tutorial on Support Vector Machines for Pattern Recognition [ps]
Bayesian Gaussian Tutorial
If you want details on fully Bayesian inference for Gaussian models, integrating
over the parameters (which is a little too advanced for our purposes), feel free to read
up on your own in the following optional tutorial by Tom Minka:
Inferring a Gaussian Distribution [ps]
If you are having trouble with multinomial distributions (for counts and discrete data),
take a look at Tom Minka's tutorial:
Bayesian Inference, Entropy and the Multinomial Distribution
Matlab is one of the best tools for designing machine learning
algorithms and many of the class assignments and class projects will
be easiest to implement and explore with it. Alternatively, it is
possible to use other mathematical software such as Mathematica or
MathCad, although these will be much more awkward. It is also
possible to use C/C++ or Java as the implementation platform, but you
will need matrix libraries.
Matlab is available to the Columbia community through AcIS.
You just connect (e.g., using ssh) to an AcIS CUNIX machine like:
ssh -l yourusername cunix.cc.columbia.edu
And then run 'matlab' (which lives in /opt/local/bin/matlab).
See the following for more details (Windows or Unix):
Matlab Software through AcIS
The Columbia University Computer Science department also has Matlab
available on various Unix machines (in /usr/local/bin/matlab).
Matlab Tutorials (from simplest to most elaborate):
MTU Introduction to Matlab
Mathworks' Matlab documentation
A Matlab tutorial on least squares to help with regression.
Example code (fits polynomial regression to x,y data):
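The linked script itself is not reproduced here; purely as an illustration, a minimal NumPy version of the same polynomial least-squares fit (the function name fit_polynomial is my own) might look like:

```python
import numpy as np

def fit_polynomial(x, y, degree):
    """Fit polynomial coefficients (lowest order first) to (x, y) by least squares."""
    # Design matrix with columns [1, x, x^2, ..., x^degree]
    X = np.vander(x, degree + 1, increasing=True)
    # Numerically stable least-squares solve (avoids forming the normal equations)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Noise-free quadratic data: y = 2 - 3x + 0.5x^2
x = np.linspace(-2, 2, 20)
y = 2 - 3 * x + 0.5 * x ** 2
w = fit_polynomial(x, y, degree=2)
```

With noisy data the recovered coefficients would only approximate the true ones, but the call is the same.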
Example code for EM for mixtures of Gaussians (needs the following four .m files):
Example code (generates samples from a Gaussian given its mean and covariance matrix):
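The .m file is linked above; for reference, the standard recipe it implements (sample i.i.d. standard normals, then apply an affine transform through a Cholesky factor of the covariance) can be sketched in NumPy as follows. This is my own sketch, and sample_gaussian is a hypothetical name:

```python
import numpy as np

def sample_gaussian(mu, Sigma, n, seed=0):
    """Draw n samples from N(mu, Sigma) via a Cholesky factorization."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(Sigma)            # Sigma = L @ L.T
    z = rng.standard_normal((n, len(mu)))    # i.i.d. standard normal rows
    return mu + z @ L.T                      # affine transform: x = mu + L z

mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6], [0.6, 1.0]])
X = sample_gaussian(mu, Sigma, 100000)
```

The sample mean and sample covariance of X should approach mu and Sigma as n grows.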
Example code (plots a 2D Gaussian ellipse contour):
Example code (plots several Gaussians using the above function):
Example code (randomly initializes and plots M Gaussians):
Example code (renders a vector of gray values as an image, type 'help imageData'):
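The EM code linked above is not reproduced here. As a rough illustration of the algorithm those files implement (without the plotting), a self-contained NumPy sketch of EM for a one-dimensional mixture of Gaussians might look like this; all names here are my own:

```python
import numpy as np

def em_gmm_1d(x, K, n_iter=100):
    """EM for a 1-D mixture of K Gaussians; returns (weights, means, variances)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    pi = np.full(K, 1.0 / K)                        # mixing weights
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))   # spread initial means over the data
    var = np.full(K, np.var(x))                     # shared initial variance
    for _ in range(n_iter):
        # E-step: responsibilities r[i, k] proportional to pi_k * N(x_i | mu_k, var_k)
        r = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted maximum-likelihood updates
        Nk = r.sum(axis=0)
        pi = Nk / n
        mu = (r * x[:, None]).sum(axis=0) / Nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    return pi, mu, var
```

On two well-separated clusters this recovers the component means; on harder data, EM may need careful initialization and more iterations.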