W4995-1:
Introduction to Computational Learning Theory
Fall 2003
SYLLABUS
Instructor: Rocco Servedio (517 CSC, office hours Wed 1-3pm)
Note: for the week of Sept 15 Rocco's office hours will be
Thursday, Sept 4 from 2-4pm.
Teaching assistants: Andrew Howard (office hours Thur 2-4, CEPSR 6LE5),
Risi Kondor (office hours TBA)
Room: 252 Engineering Terrace
Time: Tues/Thurs 11:00-12:15pm.
Introduction
The question "Can machines learn from experience?"
is one that has fascinated people for a long time.
Over the past few decades, many researchers in computer science
have studied this question from a range of applied and
theoretical perspectives.
This course will give an introduction to some of the central
topics in computational learning theory. We will study
well-defined mathematical models of learning in which it is possible
to give precise and rigorous analyses of learning problems and
learning algorithms.
A big focus of the course will be the computational efficiency
of learning in these models. We'll develop computationally
efficient algorithms for certain learning problems, and will see
why efficient algorithms are not likely to exist for other
problems.
List of Topics
This is a preliminary list of "core" topics.
Other topics may be covered as well depending on how
the semester progresses. Some topics will take more than one lecture.

- Introduction: what is computational learning theory (and why)?
- The online mistake-bound learning model. Online algorithms for simple concept classes.
- General algorithms for online learning. Lower bounds for online learning.
- The Probably Approximately Correct (PAC) learning model: definition and examples. Online-to-PAC conversions.
- Occam's Razor: learning by finding a consistent hypothesis.
- The VC dimension and uniform convergence.
- Weak versus strong learning: boosting algorithms.
- Hardness results for efficient learning based on cryptography.
- PAC learning from noisy data. Malicious noise and random classification noise. Learning from Statistical Queries.
- Exact learning from membership and equivalence queries. Learning monotone DNF and learning finite automata.
Prerequisites
You should be familiar with the following topics:

- Big-O notation and basic analysis of algorithms. Chapter 3 of "Introduction to Algorithms" by Cormen, Leiserson, Rivest and Stein is more than sufficient.
- Basic discrete math and probability. The course Computer Science 3203 and the "Mathematical Background" section (Appendix VIII) of CLRS are good sources here.
- Basic knowledge of finite automata. Chapter 1 of Sipser's book "Introduction to the Theory of Computation," or the coverage of finite automata in CS 3261, is more than sufficient.
- Most importantly, you should be comfortable with reading and writing proofs. Chapter 0 of Sipser's book is a good source here.
Some basic knowledge of the theory of NP-completeness (e.g. Chapter 7 of
Sipser or Chapter 34 of CLRS) and of elementary cryptography
(e.g. pages 881-887 of CLRS) would be helpful but is not required;
the course will be self-contained with respect to these topics.
Grading and Requirements
The requirements of the course are as follows:

- Biweekly problem sets (70% of grade). Your solutions must be typed and submitted electronically.
- Final project (30% of grade).
You may do an experimental project or a theoretical one. An experimental
project might involve implementing and testing some learning algorithm.
A theoretical project might involve reading one or more research papers
to get a more in-depth understanding of some topic we touched on in class,
or some other topic that you're interested in.
For either type of project, you'll give a brief presentation
and submit a written report. Talk to me before settling on a project topic.
The biweekly problem sets will be due on Thursdays by 5:00pm.
You are allowed six late days for the semester.
Each late day is exactly 24 hours; late days
cannot be subdivided  five minutes late counts as one late day. For
an exception, you must have your undergraduate advisor (for undergrads) or your
graduate advisor (for graduate students) contact me.
The problem sets will require you to do proofs. Some problems
will be challenging; you are advised to start the problem
sets early.
You are encouraged to discuss the course material and the homework
problems with each other in small groups (2-3 people), as long as
you list all discussion partners on your problem set.
Discussion of homework problems may include
brainstorming and verbally walking through possible solutions, but should
not include one person telling the others how to solve the problem.
In addition, each person must write up their solutions
independently;
you may not look at another student's written solutions.
You may consult outside materials, but all materials used must be
appropriately acknowledged, and you must always write up your
solutions in your own words.
Readings
The textbook for this course is:

M. Kearns and U. Vazirani.
An Introduction to Computational Learning Theory.
This book is available online and at the Columbia Bookstore.
It's a very good book, but several topics we'll cover are not in
the book (some pointers to papers which cover these topics will be
given).
Schedule of Topics
Click here for a preliminary course schedule.
Web Bulletin Board
Click here to get to the course bulletin board.
Problem Sets
Click here to get to the homework page.