MACHINE LEARNING                             September 1, 2013

COMS4771-001 COURSE INFO

 

Time & Location

T/Th 2:40pm-3:55pm at 207 Math

Instructor

Professor Tony Jebara, jebara(at)cs(dot)columbia(dot)edu

Office Hours

T/Th 1:00pm-1:45pm at 605 CEPSR

TAs

Sriram Balasubramanian, sb3457(at)columbia(dot)edu

Sunil Khanal, sk3679(at)columbia(dot)edu

Faiza Khan Khattak, fk2224(at)columbia(dot)edu

Kaili Zhang, kz2203(at)columbia(dot)edu

Lizhong Zhang, lz2324(at)columbia(dot)edu

Bulletin Board

Available via courseworks.columbia.edu; this is the best
way to contact the Professor and the TAs. Use it for
clarifications on lectures, questions about homework, etc.

 

Prerequisites: Knowledge of linear algebra and introductory probability or statistics.

 

Description: This course introduces topics in machine learning for both generative
and discriminative estimation. Material will include least squares methods, Gaussian
distributions, linear classification, linear regression, maximum likelihood, exponential
family distributions, Bayesian networks, Bayesian inference, mixture models, the EM
algorithm, graphical models, hidden Markov models, support vector machines, and
kernel methods. Students are expected to implement several algorithms in Matlab
and have some background in linear algebra and statistics.
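Least squares, the first algorithmic topic on the schedule, can be sketched in a few lines. The course itself uses Matlab, but the same idea is shown below in Python with NumPy (a minimal illustration with synthetic data, not a course assignment):

```python
import numpy as np

# Ordinary least squares: find w minimizing ||Xw - y||^2.
# Here we generate synthetic data from known weights and recover them.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(0, 10, 50)])  # bias column + one feature
true_w = np.array([2.0, 0.5])
y = X @ true_w + rng.normal(0, 0.1, 50)  # targets with small Gaussian noise

# lstsq solves the normal equations in a numerically stable way.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # close to [2.0, 0.5]
```

In Matlab the equivalent one-liner is `w = X \ y`, which also solves the least squares problem.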

 

Required Texts:

 

Michael I. Jordan and Christopher M. Bishop, Introduction to Graphical Models.

Still unpublished. Available online (password-protected) on class home page.

 

Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer.

First Edition (2006) is preferred. ISBN: 0387310738.

 

Optional Texts: Available at the library (additional handouts will also be given).

 

Tony Jebara, Machine Learning: Discriminative and Generative. Kluwer, Boston, MA, 2004.

ISBN: 1402076479.

 

R.O. Duda, P.E. Hart and D.G. Stork, Pattern Classification, John Wiley & Sons, 2001.

 

Trevor Hastie, Robert Tibshirani and Jerome Friedman, The Elements of Statistical

Learning. Springer-Verlag, New York, 2009. 2nd Edition. ISBN: 0387848576.

 

Graded Work: Grades will be based on 5 homeworks (45%), the midterm (20%),

two surprise in-class quizzes (5%), and the final exam (30%). Any material covered in

assigned readings, handouts, homeworks, solutions, or lectures may appear in exams.

If you miss the midterm and don't have an official reason, you will get 0 on it.

If you have an official reason, your midterm grade will be based on the final exam.

If you miss a quiz and don't have an official reason, you will get 0 on it.

If you have an official reason, your missed quiz grade will be based on the final exam.
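As a quick illustration of the weighting above (hypothetical scores out of 100, purely to show the arithmetic, not an official grade calculator):

```python
# Grade weights from the syllabus: homeworks 45%, midterm 20%,
# quizzes 5%, final exam 30%.
weights = {"homeworks": 0.45, "midterm": 0.20, "quizzes": 0.05, "final": 0.30}
scores = {"homeworks": 88, "midterm": 75, "quizzes": 90, "final": 82}  # hypothetical

final_grade = sum(weights[k] * scores[k] for k in weights)
print(final_grade)  # 0.45*88 + 0.20*75 + 0.05*90 + 0.30*82 = 83.7
```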

 

Tentative Schedule:

Date          Topic

September 3   Lecture 01: Introduction
September 5   Lecture 02: Least Squares
September 10  Lecture 03: Linear Classification and Regression
September 12  Lecture 04: Neural Networks and BackProp
September 17  Lecture 05: Neural Networks and BackProp
September 19  Lecture 06: Support Vector Machines
September 24  Lecture 07: Support Vector Machines
September 26  Lecture 08: Kernels and Mappings
October 1     Lecture 09: Probability Models
October 3     Lecture 10: Probability Models
October 8     Lecture 11: Bernoulli Models and Naive Bayes
October 10    Lecture 12: Multinomial Models for Text
October 15    Lecture 13: Graphical Models Preview
October 17    Lecture 14: Gaussian Models
October 22    MIDTERM
October 24    Lecture 15: Gaussian Regression and PCA
October 29    Lecture 16: Bayesian Inference
October 31    Lecture 17: The Exponential Family
November 5    ELECTION DAY (NO CLASS)
November 7    Lecture 18: Mixture Models and Kmeans Clustering
November 12   Lecture 19: Expectation Maximization
November 14   Lecture 20: Expectation Maximization
November 19   Lecture 21: Graphical Models
November 21   Lecture 22: Graphical Models
November 26   Lecture 23: Junction Tree Algorithm
November 28   THANKSGIVING DAY (NO CLASS)
December 3    Lecture 24: Junction Tree Algorithm
December 5    Lecture 25: Hidden Markov Models

 

 

Class Attendance: You are responsible for all material presented in the class

lectures, recitations, and so forth. Some material will diverge from the textbooks,

so regular attendance is important.

 

Late Policy: If you hand in late work without approval of the instructor or TAs,

you will receive zero credit. Deadlines are non-negotiable.

 

Cooperation on Homework: Collaborating on, sharing, or copying

solutions is not allowed. Of course, no cooperation is allowed during exams.

This policy will be strictly enforced.

 

Web Page: The class URL is: http://www.cs.columbia.edu/~jebara/4771 and

will contain copies of class notes, news updates and other information.

 

Computer Accounts: You will need an ACIS computer account for email, access

to Matlab (Windows, Unix, or Mac version), and so forth.