MACHINE LEARNING September 1, 2012
COMS4771-001
COURSE INFO

Time & Location: T/Th 2:40pm-3:55pm at 428 Pupin
Instructor:      Professor Tony Jebara, jebara(at)cs(dot)columbia(dot)edu
Office Hours:    T/Th 4:00pm-4:45pm at 605 CEPSR
TAs:             Amit Sengupta, as4323(at)columbia(dot)edu
                 Jie Feng, jf2776(at)columbia(dot)edu
                 Anna Choromanska, aec2163(at)columbia(dot)edu
                 Garfield Xu Tan, ttanxu(at)gmail(dot)com
                 Faiza Khan Khattak, fk2224(at)columbia(dot)edu
                 Dingquan Wang, dw2546(at)columbia(dot)edu
Bulletin Board:  Available via courseworks.columbia.edu
Prerequisites: Knowledge of linear algebra
and introductory probability or statistics.
Description: This course introduces topics in machine learning for both generative
and discriminative estimation. Material will include least squares methods, Gaussian distributions, linear classification, linear regression, maximum likelihood, exponential family distributions, Bayesian networks, Bayesian inference, mixture models, the EM algorithm, graphical models, hidden Markov models, support vector machines, and kernel methods. Students are expected to implement several algorithms in Matlab and have some background in linear algebra and statistics.
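As a taste of the first topic above, least squares fits a linear model by minimizing squared error. Course assignments use Matlab, but a minimal illustrative sketch in Python/NumPy (with hypothetical synthetic data) looks like this:

```python
import numpy as np

# Synthetic data: y = 2x + 1 plus small noise (hypothetical example data)
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), np.linspace(0, 1, 50)])  # design matrix with a bias column
y = X @ np.array([1.0, 2.0]) + 0.01 * rng.standard_normal(50)

# Solve the least squares problem min_w ||Xw - y||^2
# (lstsq solves the normal equations X'Xw = X'y in a numerically stable way)
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # close to [1.0, 2.0]
```

The equivalent Matlab call is the backslash operator, `w = X \ y`.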
Required Texts:

Michael I. Jordan and Christopher M. Bishop, Introduction to Graphical Models. Still unpublished; available online (password-protected) on the class home page.

Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006. First Edition is preferred. ISBN: 0387310738.
Optional Texts: Available at the library (additional handouts will also be given).

Tony Jebara, Machine Learning: Discriminative and Generative, Kluwer, Boston, MA, 2004. ISBN: 1402076479.

R.O. Duda, P.E. Hart and D.G. Stork, Pattern Classification, John Wiley & Sons, 2001.

Trevor Hastie, Robert Tibshirani and Jerome Friedman, The Elements of Statistical Learning, 2nd Edition, Springer-Verlag, New York, 2009. ISBN: 0387848576.
Graded Work: Grades will be based on 5 homeworks (45%), the midterm (20%), two surprise in-class quizzes (5%), and the final exam (30%). Any material covered in assigned readings, handouts, homeworks, solutions, or lectures may appear in exams.

If you miss the midterm without an official reason, you will receive a 0 on it; with an official reason, your midterm grade will be based on the final exam. Likewise, if you miss a quiz without an official reason, you will receive a 0 on it; with an official reason, your missed quiz grade will be based on the final exam.
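The weighting above sums to 100%, so an overall grade is just a weighted average. A small sketch, using entirely hypothetical scores:

```python
# Grade weights from the syllabus; scores below are hypothetical.
weights = {"homeworks": 0.45, "midterm": 0.20, "quizzes": 0.05, "final": 0.30}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights cover exactly 100%

scores = {"homeworks": 88.0, "midterm": 75.0, "quizzes": 90.0, "final": 82.0}
overall = sum(weights[k] * scores[k] for k in weights)
print(round(overall, 2))  # 83.7
```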
Tentative Schedule:

Date         | Topic
September 4  | Lecture 01: Introduction
September 6  | Lecture 02: Least Squares
September 11 | Lecture 03: Linear Classification and Regression
September 13 | Lecture 04: Neural Networks and BackProp
September 18 | Lecture 05: Neural Networks and BackProp
September 20 | Lecture 06: Support Vector Machines
September 25 | Lecture 07: Support Vector Machines
September 27 | Lecture 08: Kernels and Mappings
October 2    | Lecture 09: Probability Models
October 4    | Lecture 10: Probability Models
October 9    | Lecture 11: Bernoulli Models and Naive Bayes
October 11   | Lecture 12: Multinomial Models for Text
October 16   | Lecture 13: Graphical Models Preview
October 18   | Lecture 14: Gaussian Models
October 23   | MIDTERM
October 25   | Lecture 15: Gaussian Regression and PCA
October 30   | Lecture 16: Bayesian Inference
November 1   | Lecture 17: The Exponential Family
November 6   | ELECTION DAY (NO CLASS)
November 8   | Lecture 18: Mixture Models and Kmeans Clustering
November 13  | Lecture 19: Expectation Maximization
November 15  | Lecture 20: Expectation Maximization
November 20  | Lecture 21: Graphical Models
November 22  | THANKSGIVING DAY (NO CLASS)
November 27  | Lecture 22: Graphical Models
November 29  | Lecture 23: Junction Tree Algorithm
December 4   | Lecture 24: Junction Tree Algorithm
December 6   | Lecture 25: Hidden Markov Models
Class Attendance: You are responsible for all material presented in the class lectures, recitations, and so forth. Some material will diverge from the textbooks, so regular attendance is important.
Late Policy: If you hand in late work without approval of the instructor or TAs, you will receive zero credit. Homework is due at the beginning of class on the due date.
Cooperation on Homework: Collaborating on, sharing, or copying solutions is not allowed. Of course, no cooperation is allowed during exams. This policy will be strictly enforced.
Web Page: The class URL is http://www.cs.columbia.edu/~jebara/4771 and will contain copies of class notes, news updates, and other information.
Computer Accounts: You will need an ACIS computer account for email, use of Matlab (Windows, Unix, or Mac version), and so forth.