Rocco's lecture notes will be posted soon after each class. You are highly encouraged to attend all classes.
Warning: the notes below were generated in real time and have not been edited. They may contain typos or other errors. They may also contain aesthetically displeasing color combinations.
| # | Date | Topic | Reading |
|---|------|-------|---------|
| 1 | Mon January 11 | Introduction, basics | |
| 2 | Wed January 13 | Basics, online mistake-bound learning, elimination algorithm | Blum survey sec. 3.0, 3.1 |
| | Mon January 18 | No class: MLK Day | |
| 3 | Wed January 20 | Learning decision lists, Winnow1 | Blum survey sec. 3.2, Littlestone paper sec. 5 (just through Theorem 7) |
| 4 | Mon January 25 | Winnow1, Winnow2, Perceptron | Blum survey sec. 3.2, Littlestone paper sec. 5 (just through Theorem 7), handout on Perceptron and kernel methods |
| 5 | Wed January 27 | Perceptron, kernel methods, general bounds on OLMB learning | Handout on Perceptron and kernel methods, Littlestone paper sec. 1-3 (don't worry about the stuff about the SOA) |
| 6 | Mon February 1 | General bounds on OLMB learning: Halving Algorithm, VC dimension | Blum survey sec. 2.0, 2.1, 2.2, Littlestone paper sec. 1-3 (don't worry about the stuff about the SOA) |
| 7 | Wed February 3 | General bounds on OLMB learning: (Randomized) Weighted Majority; start PAC learning | Blum survey sec. 2.0, 2.1, 2.2, Kearns and Vazirani chapter 1.1-1.3 |
| 8 | Mon February 8 | PAC learning: definition, learning intervals, OLMB-to-PAC conversion | Kearns and Vazirani chapter 1.1-1.3 |
| 9 | Wed February 10 | PAC learning: more definitional subtleties, hypothesis testing / probability basics, learning via consistent hypotheses | Kearns and Vazirani chapters 1, 2, appendix (chapter 9) |
| 10 | Mon February 15 | PAC learning: consistent hypothesis finders, Occam's razor, application to learning sparse disjunctions | Kearns and Vazirani chapters 1, 2 |
| 11 | Wed February 17 | PAC sample-efficient learning of sparse disjunctions via Occam and greedy set cover, proper versus improper learning | Kearns and Vazirani chapters 1, 2 |
| 12 | Mon February 22 | Proper PAC learning of 3-term DNF is hard | Kearns and Vazirani chapters 1, 2 |
| 13 | Mon March 8 | Lower bound on PAC learning sample complexity based on VC dimension; Sauer-Shelah lemma | Kearns and Vazirani chapter 3 |
| 14 | Wed March 10 | Sauer-Shelah lemma; PAC learning using CHF with VC dimension controlling required sample complexity | Kearns and Vazirani chapter 3 |
| 15 | Mon March 15 | PAC learning using CHF with VC dimension controlling required sample complexity; learning LTFs over Euclidean space; confidence boosting; start accuracy boosting | Kearns and Vazirani chapter 3 |
| 16 | Wed March 17 | Boosting overview; simple 3-stage accuracy improving procedure | Kearns and Vazirani chapter 4 through 4.3.2, Schapire boosting overview paper |
| 17 | Mon March 22 | Boosting by filtering, boosting by sampling; AdaBoost | Schapire boosting overview paper (through 8.3) |
| 18 | Wed March 24 | Finish AdaBoost proof, start learning with noise | Schapire boosting overview paper (through 8.3), Kearns and Vazirani chapter 5 |
| 19 | Mon March 29 | Lower bounds and algorithmic strategies for malicious noise; start random classification noise | Kearns and Vazirani chapter 5 (through Section 5.2) |
| 20 | Wed March 31 | Random classification noise, Statistical Query learning | Kearns and Vazirani chapter 5 |
| 21 | Mon April 5 | Statistical Query learning, PAC learning with RCN, lower bounds on SQ learning | Kearns and Vazirani chapter 5 |
| 22 | Wed April 7 | Noise-tolerant learning of PAR, noise and proper learning, start representation-independent hardness of learning | Kearns and Vazirani chapter 6 |
| 23 | Mon April 12 | Computational hardness of learning based on pseudorandomness | Kearns and Vazirani chapter 6 |
| 24 | Wed April 14 | Computational hardness of learning based on public-key cryptography / one-way permutations | Kearns and Vazirani chapter 6 |
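For a concrete feel for the online mistake-bound (OLMB) setting of the early lectures, here is a minimal Perceptron sketch. This is illustrative toy code, not one of the course handouts: the example stream, labels in {-1, +1}, and the function name are all hypothetical.

```python
# Toy sketch of the online Perceptron in the mistake-bound model
# (illustrative only; not from the course handouts).
def perceptron(stream):
    """Run the online Perceptron over (x, y) pairs; return (weights, mistakes)."""
    w = None
    mistakes = 0
    for x, y in stream:  # y is a label in {-1, +1}
        if w is None:
            w = [0.0] * len(x)  # start at the zero weight vector
        # Predict with the sign of the dot product (0 counts as predicting -1).
        dot = sum(wi * xi for wi, xi in zip(w, x))
        pred = 1 if dot > 0 else -1
        if pred != y:
            mistakes += 1
            # Mistake-driven additive update: w <- w + y * x.
            w = [wi + y * xi for wi, xi in zip(w, x)]
    return w, mistakes
```

The defining feature of the model shows up in the update rule: the hypothesis changes only on rounds where the learner errs, which is what makes counting mistakes (rather than rounds) the right complexity measure.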
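The Weighted Majority algorithm from the general-bounds lectures is similarly short. The sketch below is the deterministic variant with halving penalties; the expert pool, the bit-prediction interface, and the function name are assumptions for illustration, not the course's notation.

```python
# Hedged sketch of deterministic Weighted Majority over a finite expert pool
# (illustrative only): predict with the weighted vote, then halve the weight
# of every expert that was wrong this round.
def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """expert_predictions[t][i] is expert i's bit in round t.
    Returns (learner_mistakes, final_weights)."""
    n = len(expert_predictions[0])
    w = [1.0] * n  # all experts start with weight 1
    mistakes = 0
    for preds, outcome in zip(expert_predictions, outcomes):
        vote1 = sum(wi for wi, p in zip(w, preds) if p == 1)
        vote0 = sum(wi for wi, p in zip(w, preds) if p == 0)
        guess = 1 if vote1 >= vote0 else 0
        if guess != outcome:
            mistakes += 1
        # Multiplicatively penalize every expert that erred, right or wrong guess.
        w = [wi * beta if p != outcome else wi for wi, p in zip(w, preds)]
    return mistakes, w
```

The multiplicative penalty is what yields the classic bound: the learner's mistakes are at most a constant times the best expert's mistakes plus O(log n), since the total weight drops geometrically with each learner mistake while the best expert's weight stays large.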
Here is an anticipated list of topics. Note that the ordering of some topics may change, and we may spend more or less than one lecture per topic.