The question "Can machines think?" has long fascinated people. It is closely related to the question "Can machines learn?", which researchers in computer science have studied from many different perspectives.

This course will give an introduction to some of the central topics in computational learning theory, a field which approaches the above question from a theoretical computer science perspective. We will study well-defined mathematical and computational models of learning in which it is possible to give precise and rigorous analyses of learning problems and learning algorithms. A major focus of the course will be the computational efficiency of learning in these models. We will develop computationally efficient algorithms for certain learning problems, and will see why efficient algorithms are unlikely to exist for others.

**Courseworks**: Has Zoom links and passwords for the class and for office hours.

**Piazza**: Forum for asking questions about course material and homework assignments.

**Gradescope**: Submit assignments and view grades here. (You can set up Gradescope by clicking the "Gradescope" module on Courseworks.)

**Anonymous Feedback Form**: Help the staff make this course better! We are eager to hear from you.

**Instructor:** Rocco Servedio

**Instruction modality:** Hybrid (Lectures for the weeks of **Jan 11-15** and **Jan 18-22** will be online only!)

**Classroom:** 309 Havemeyer Hall

**Time:** Mon/Wed 8:40am-9:55am Eastern Time (UTC -5:00)

**Course email** (for administrative issues; use Piazza for subject matter questions): coms4252columbias21 at gmail dot com

This is a preliminary list of core topics. Other topics may be covered depending on how the semester progresses. Most topics will take several lectures. For more information, click on the "Lectures" tab above.

- Introduction: What is computational learning theory (and why)? Basic notions (learning models, concept classes).
- The online mistake-bound learning model. Online algorithms for simple learning problems (elimination, Perceptron, Winnow).
- General algorithms and lower bounds for online learning (halving algorithm, Weighted Majority algorithm, VC dimension).
- The Probably Approximately Correct (PAC) learning model: definition and examples. Online to PAC conversions.
- Occam's Razor: learning by finding a consistent hypothesis. Relation to computationally efficient learning.
- The VC dimension and uniform convergence.
- Weak versus strong learning: accuracy boosting algorithms.
- PAC learning from noisy data. Malicious noise and random classification noise. Learning from Statistical Queries.
- Computational hardness results for efficient learning based on cryptography.
- Exact learning from membership and equivalence queries. Learning monotone DNF and learning finite automata.
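As a small taste of the online mistake-bound model mentioned above, here is a minimal sketch (not course code) of the classic Perceptron algorithm: the learner maintains a weight vector, predicts the sign of the inner product with each incoming example, and updates the weights only when it makes a mistake. The example stream and labels below are illustrative toy data.

```python
def perceptron(stream):
    """Run the online Perceptron on a stream of (x, y) pairs with
    labels y in {-1, +1}; return the final weights and mistake count."""
    w = None
    mistakes = 0
    for x, y in stream:
        if w is None:
            w = [0.0] * len(x)          # start with the zero vector
        dot = sum(wi * xi for wi, xi in zip(w, x))
        pred = 1 if dot > 0 else -1     # predict the sign of w . x
        if pred != y:                   # update only on a mistake
            mistakes += 1
            w = [wi + y * xi for wi, xi in zip(w, x)]
    return w, mistakes

# Toy linearly separable stream: the label is the sign of a fixed
# linear function of the example (here, roughly the first coordinate).
examples = [((1.0, 0.5), 1), ((-1.0, 0.2), -1), ((2.0, -1.0), 1),
            ((-0.5, -0.5), -1), ((1.5, 1.0), 1)]
w, m = perceptron(examples * 3)         # cycle a few times through the data
```

Because the data is linearly separable with some margin, the classic Perceptron mistake bound guarantees that the total number of updates is finite no matter how long the stream is; after enough passes, the final weights classify every example correctly.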