ADVANCED MACHINE LEARNING & PERCEPTION
COMS 4772/6772 COURSE INFO
January 2013

 

Day & Time: Tu/Th 2:40pm-3:55pm
Location: 1127 Mudd

Instructor: Professor Tony Jebara
Email: jebara(at)cs(dot)columbia(dot)edu

Office Hours: CEPSR 605, Th 4-5 or by appointment

TA: Dingquan Wang, dw2546(at)columbia(dot)edu
TA Office Hours: CEPSR 6LE5, M/W 9-11 or by appointment


Prerequisites: COMS W4771 or permission of the instructor. Knowledge of linear
algebra and introductory probability or statistics is required.

 

Description: An exploration of advanced machine learning tools for perception
and behavior learning. How can machines perceive, learn from, and classify
human activity computationally? Topics include Appearance-Based Models,
Principal and Independent Components Analysis, Dimensionality Reduction,
Kernel Methods, Manifold Learning, Latent Models, Regression, Classification,
Bayesian Methods, Maximum Entropy Methods, Real-Time Tracking, Extended
Kalman Filters, Time Series Prediction, Hidden Markov Models, Factorial HMMs,
Input-Output HMMs, Markov Random Fields, Variational Methods, Structured
Prediction, Dynamic Bayesian Networks, and Gaussian/Dirichlet Processes.

 

Required Texts: None; readings will be provided primarily through handouts and
links to various research papers.

 

Optional Texts:

 

Tony Jebara, Machine Learning: Discriminative and Generative. Kluwer Academic Publishers, 2004.

Michael I. Jordan and Christopher M. Bishop, Introduction to Graphical Models.
Still unpublished; available online (password-protected) on the class home page.

 

R.O. Duda, P.E. Hart and D.G. Stork, Pattern Classification, John Wiley & Sons, 2001.

 

Trevor Hastie, Robert Tibshirani and Jerome Friedman, The Elements of Statistical
Learning. Springer Series in Statistics, Springer-Verlag, New York, 2001.


Graded Work: Grades are based on two applied homework assignments (45% of the
grade) and a large research-level project with a final presentation (55%).


Tentative Schedule (a wish list; we'll see how much we can cover!):

Week 1: Introduction, Review of Basic Concepts, Representation Issues, Vector and Appearance-Based Models, Correlation and Least Squared Error Methods, Bases, Eigenspace Recognition, Principal Components Analysis

Week 2: Nonlinear Dimensionality Reduction, Manifolds, Kernel PCA, Locally Linear Embedding, Maximum Variance Unfolding, Minimum Volume Embedding

Week 3: Support Vector Machines and Related Machines, VC Dimension, Large Margin, Large Relative Margin

Week 4: Kernel Methods, Reproducing Kernel Hilbert Space, Probabilistic Kernel Approaches, Kernel Principal Components Analysis, Bag of Vectors/Pixel Kernels

Week 5: Maximum Entropy, Iterative Scaling, Maximum Entropy Discrimination, Large Margin Probability Models

Week 6: SVM Extensions, Multi-Class Classification, Structured Prediction, Feature Selection, Kernel Selection, Meta-Learning, Semi-Supervised Learning

Week 7: Bayesian Networks, Belief Propagation, Hidden Markov Models, Markov Random Fields

Week 8: Kalman Filtering, Structure from Motion, Parameter Estimation, Coupled and Linked Hidden Markov Models, Variational and Mean-Field Methods

Week 9: Factorial Hidden Markov Models, Switched Kalman Filters, Dynamic Bayesian Networks, Structured Mean-Field

Week 10: Graph Learning, b-Matching, Loopy Belief Propagation, Perfect Graphs

Week 11: Spectral Clustering, Random Walks, Normalized Cuts (Ncuts) Methods, Stability, Image Segmentation

Week 12: Boosting, Mixtures of Experts, AdaBoost, Online Learning

Week 13: Project Presentations

Week 14: Project Presentations


Class Attendance: Class participation and interaction are an important aspect of
this course; ideally it will run as a seminar, with material presented through
lectures, recitations, and so forth. Some material will diverge from the textbooks,
so regular attendance is important.

 

Late Policy: If you hand in late work without approval of the instructor or TAs,
you may receive zero credit. Homework is due as announced on its web page.
For the final project, please submit on time regardless of additional progress;
each day of lateness will cost you a minimum of 15%. We won't give extensions,
regardless of how ambitious your project is.

 

Cooperation on Homework: Collaborating on, sharing, or copying solutions is not allowed.

 

Web Page: The class home page is http://www.cs.columbia.edu/~jebara/6772 and
will contain copies of handouts, homework assignments, solutions, and other
information.

 

Computer Accounts: You will need an ACIS computer account for email, use of
Matlab (unless you have a Windows version), and so forth.