Introduction to Computational Learning Theory
Instructor: Rocco Servedio (517 CSC, office hours Wed 1-3pm)
Note: for the week of Sept 1-5 Rocco's office hours will be
Thursday, Sept 4 from 2-4pm.
Teaching assistants: Andrew Howard (office hours Thur 2-4pm, CEPSR)
Risi Kondor (office hours TBA)
Room: 252 Engineering Terrace
Time: Tues/Thurs 11:00-12:15pm.
The question "Can machines learn from experience?"
is one that has fascinated people for a long time.
Over the past few decades, many researchers in computer science
have studied this question from a range of applied and theoretical perspectives.
This course will give an introduction to some of the central
topics in computational learning theory. We will study
well-defined mathematical models of learning in which it is possible
to give precise and rigorous analyses of learning problems and learning algorithms.
A major focus of the course will be the computational efficiency
of learning in these models. We'll develop computationally
efficient algorithms for certain learning problems, and will see
why efficient algorithms are unlikely to exist for other problems.
List of Topics
This is a preliminary list of "core" topics.
Other topics may be covered as well depending on how
the semester progresses. Some topics will take more than one lecture.
- Introduction: what is computational learning theory (and why)?
- The online mistake-bound learning model. Online algorithms for simple concept classes.
- General algorithms for online learning. Lower bounds for online learning.
- The Probably Approximately Correct (PAC) learning model: definition and examples. Online-to-PAC conversions.
- Occam's Razor: learning by finding a consistent hypothesis.
- The VC dimension and uniform convergence.
- Weak versus strong learning: boosting algorithms.
- Hardness results for efficient learning based on cryptography.
- PAC learning from noisy data. Malicious noise and random classification noise. Learning from Statistical Queries.
- Exact learning from membership and equivalence queries. Learning monotone DNF and learning finite automata.
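As a small taste of the mistake-bound model listed above, here is a sketch of the classic elimination algorithm for learning a monotone conjunction over {0,1}^n online: the learner makes at most n+1 mistakes. (This is an illustrative sketch, not course code; the function and variable names are our own.)

```python
def elimination_learner(n, examples):
    """Online elimination algorithm for monotone conjunctions over {0,1}^n.

    examples: iterable of (x, label) pairs, where x is a length-n tuple of
    0/1 bits and label is the hidden monotone conjunction's value on x.
    Returns (predictions, mistake count).
    """
    relevant = set(range(n))   # hypothesis: conjunction of all n variables
    mistakes = 0
    predictions = []
    for x, label in examples:
        guess = all(x[i] == 1 for i in relevant)   # evaluate hypothesis on x
        predictions.append(guess)
        if guess != label:
            mistakes += 1
            if label:
                # Mistake on a positive example: every variable set to 0 in x
                # cannot be in the target conjunction, so eliminate it.
                relevant -= {i for i in relevant if x[i] == 0}
            # A mistake on a negative example is impossible here: the
            # hypothesis always conjoins a superset of the target's variables,
            # so whenever the hypothesis is true the target is true as well.
    return predictions, mistakes
```

Each mistake on a positive example removes at least one variable from the hypothesis, which gives the n+1 mistake bound. For example, with target x1 AND x3 over n = 4, the input (1,0,1,0) labeled True forces one mistake and eliminates x2 and x4 in a single step.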
You should be familiar with the following topics:
- Big-O notation and basic analysis of algorithms. Chapter 3 in "Introduction to Algorithms" by Cormen, Leiserson, Rivest and Stein is more than sufficient.
- Basic discrete math and probability. The course Computer Science 3203 and the "Mathematical Background" section (Appendix VIII) of CLRS are good sources here.
- Basic knowledge of finite automata. Chapter 1 of Sipser's book "Introduction to the Theory of Computation," or the coverage of finite automata in CS 3261, is more than sufficient.
- Most importantly, you should be comfortable reading and writing proofs. Chapter 0 of Sipser's book "Introduction to the Theory of Computation" is a good source here.
Some basic knowledge of the theory of NP-completeness (e.g. Chapter 7 of Sipser or Chapter 34 of CLRS) and of elementary cryptography (e.g. pages 881-887 of CLRS) would be helpful but is not required -- the course will be self-contained with respect to these topics.
Grading and Requirements
The requirements of the course are as follows:
- Biweekly problem sets (70% of grade). Your solutions must be typed and submitted electronically. Problem sets will be due on Thursdays by 5:00pm.
- Final project (30% of grade).
You are allowed six late days for the semester.
Each late day is exactly 24 hours; late days
cannot be subdivided -- five minutes late counts as one late day. For
an exception, you must have your undergraduate advisor (for undergrads) or your
graduate advisor (for graduate students) contact me.
You may do an experimental project or a theoretical one. An experimental
project might involve implementing and testing some learning algorithm.
A theoretical project might involve reading one or more research papers
to get a more in-depth understanding of some topic we touched on in class,
or some other topic that you're interested in.
For either type of project, you'll give a brief presentation
and submit a written report. Talk to me before settling on a project topic.
The problem sets will require you to do proofs. Some problems
will be challenging; you are advised to start each problem set early.
You are encouraged to discuss the course material and the homework
with each other in small groups (2-3 people), as long as you list all
partners on your problem set.
Discussion of homework problems may include
brainstorming and verbally walking through possible solutions, but should
not include one person telling the others how to solve the problem.
In addition, each person must write up their solutions independently;
you may not look at another student's written solutions.
You may consult outside materials, but all materials used must be
appropriately acknowledged, and you must always write up your
solutions in your own words.
The textbook for this course is:
M. Kearns and U. Vazirani.
An Introduction to Computational Learning Theory.
This book is available on-line and at the Columbia Bookstore.
It's a very good book, but several topics we'll cover are not in
the book (pointers to papers which cover these topics will be provided).
Schedule of Topics
Click here for a preliminary course schedule.
Web Bulletin Board
Click here to get to the course bulletin board.
Problem Sets
Click here to get to the homework page.