MACHINE LEARNING                             September 8, 2015

COMS4771-001 COURSE INFO

 

Time & Location

T/Th 10:10am-11:25am at 501 NWC

Instructor

Professor Tony Jebara, jebara(at)cs(dot)columbia(dot)edu

Office Hours

W 2:00pm-4:00pm at 605 CEPSR

TAs

Robert Dadashi-Tazehozi, rd2669(at)columbia(dot)edu

Henrique Spyra Gubert, hs2807(at)columbia(dot)edu

Chang Chen, cc3757(at)columbia(dot)edu

Jialu Zhong, jz2612(at)columbia(dot)edu

Robert Ying, ry2242(at)columbia(dot)edu

Michelle Tadmor, mdt2125(at)columbia(dot)edu

Bulletin Board

Available via courseworks.columbia.edu; this is the best
way to contact the Professor and the TAs. Use it for
clarifications on lectures, questions about homework, etc.

 

Prerequisites: Knowledge of linear algebra and introductory probability or statistics.

 

Description: This course introduces topics in machine learning for both generative
and discriminative estimation. Material will include least squares methods, Gaussian
distributions, linear classification, linear regression, maximum likelihood, exponential
family distributions, Bayesian networks, Bayesian inference, mixture models, the EM
algorithm, graphical models, hidden Markov models, support vector machines, and
kernel methods. Students are expected to implement several algorithms in Matlab
and have some background in linear algebra and statistics.
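
For a sense of scale, homework implementations will typically look like the
following Matlab sketch (a hypothetical illustration, not an actual assignment),
which fits a linear model by ordinary least squares:

    % Hypothetical example: ordinary least squares on synthetic data.
    n = 50;                          % number of training points
    X = [ones(n,1) randn(n,2)];      % design matrix with a bias column
    w_true = [1; 2; -3];             % made-up ground-truth weights
    y = X*w_true + 0.1*randn(n,1);   % targets with Gaussian noise
    w_hat = X \ y;                   % least-squares solution via backslash
    disp(norm(w_hat - w_true))       % recovery error; should be small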

 

Required Texts:

 

Michael I. Jordan and Christopher M. Bishop, Introduction to Graphical Models.

Still unpublished; available online via courseworks.columbia.edu.

 

Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.

The first edition is preferred. ISBN: 0387310738.

 

Optional Texts: Available at library (additional handouts will also be given).

 

Tony Jebara, Machine Learning: Discriminative and Generative, Kluwer, Boston, MA, 2004.

ISBN: 1402076479.

 

R.O. Duda, P.E. Hart and D.G. Stork, Pattern Classification, John Wiley & Sons, 2001.

 

Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical

Learning, 2nd Edition. Springer-Verlag, New York, 2009. ISBN: 0387848576.

 

Graded Work: Grades will be based on 5 homeworks (45%), the midterm (20%),
two surprise in-class quizzes (5%), and the final exam (30%). Any material
covered in assigned readings, handouts, homeworks, solutions, or lectures may
appear in exams. If you miss the midterm or a quiz without an official reason,
you will get 0 on it. If you have an official reason, the grade for the missed
midterm or quiz will be based on the final exam.
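
For concreteness, the weights above combine as a simple weighted sum; here is
a minimal Matlab sketch with made-up scores (hypothetical numbers, not course
data):

    % Hypothetical scores on a 0-100 scale, combined with the stated
    % weights: homeworks 45%, midterm 20%, quizzes 5%, final 30%.
    hw      = mean([88 92 75 90 85]);  % five homework scores -> 86
    midterm = 80;
    quizzes = mean([70 90]);           % two quiz scores -> 80
    final_exam = 86;
    grade = 0.45*hw + 0.20*midterm + 0.05*quizzes + 0.30*final_exam;
    fprintf('Final grade: %.1f\n', grade)  % prints 84.5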

 

Tentative Schedule:

Date          Topic
September 8   Lecture 01: Introduction
September 10  Lecture 02: Least Squares
September 15  Lecture 03: Linear Classification and Regression
September 17  Lecture 04: Neural Networks and BackProp
September 22  Lecture 05: Neural Networks and BackProp
September 24  Lecture 06: Support Vector Machines
September 29  Lecture 07: Support Vector Machines
October 1     Lecture 08: Kernels and Mappings
October 6     Lecture 09: Probability Models
October 8     Lecture 10: Probability Models
October 13    Lecture 11: Bernoulli Models and Naive Bayes
October 15    Lecture 12: Multinomial Models for Text
October 20    Lecture 13: Graphical Models Preview
October 22    MIDTERM
October 27    Lecture 14: Gaussian Models
October 29    Lecture 15: Gaussian Regression and PCA
November 3    ELECTION DAY (NO CLASS)
November 5    Lecture 16: Bayesian Inference
November 10   Lecture 17: The Exponential Family
November 12   Lecture 18: Mixture Models and K-means Clustering
November 17   Lecture 19: Expectation Maximization
November 19   Lecture 20: Expectation Maximization
November 24   Lecture 21: Graphical Models
November 26   THANKSGIVING DAY (NO CLASS)
December 1    Lecture 22: Graphical Models
December 3    Lecture 23: Junction Tree Algorithm
December 8    Lecture 24: Junction Tree Algorithm
December 10   Lecture 25: Hidden Markov Models
December 15   COMPREHENSIVE FINAL EXAM, 9am-12pm

 

 

Class Attendance: You are responsible for all material presented in the class
lectures, recitations, and so forth. Some material will diverge from the
textbooks, so regular attendance is important.

 

Late Policy: If you hand in late work without approval of the instructor or
TAs, you will receive zero credit. Deadlines are non-negotiable.

 

Cooperation on Homework: Collaborating on, sharing, or copying solutions is
not allowed. Of course, no cooperation is allowed during exams. This policy
will be strictly enforced.

 

Web Page: The class web page is http://www.cs.columbia.edu/~jebara/4771 and
will contain copies of class notes, news updates, and other information.

 

Matlab: We'll use Matlab for coding. Download it at www.cs.columbia.edu by
clicking on Computing -> Software -> Matlab.
Note: use JDK 1.6 instead of JDK 1.7.
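
Once Matlab is installed, a quick sanity check (an informal suggestion, not an
official setup step) is to solve a small linear system:

    % Informal smoke test for a fresh Matlab install.
    version                 % print the installed Matlab release
    A = randn(3); b = randn(3,1);
    x = A \ b;              % solve a 3x3 linear system
    disp(norm(A*x - b))     % residual should be near machine precision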