Office Hours (in the NLP lab, 7LW1 CEPSR, unless otherwise stated):
Monday 9.30am-11am, Emily Li
Monday 12.30pm-2pm, Arthur Chen
Wednesday 4pm-5.30pm, Alyssa Huang
Friday 11.10am-12.40pm, David Wan
Past midterms for the class are available here: Fall 2011, Fall 2012, Fall 2013, Fall 2014, Fall 2017, Spring 2018.
| Date | Topics | Video Lectures | References | Flipped Classroom Materials |
|------|--------|----------------|------------|-----------------------------|
| Week 1 (January 20th-24th) | Introduction to NLP; Language Modeling (Slides: we will cover slides 1-50 inclusive) | Video lectures in Courseworks: all of Modules 1-2; all of Module 3; Sections 4.1 and 4.2 in Module 4 | Sections 1.1-1.4.1 inclusive of Notes on language modeling (required reading) | Questions (part 1), Solutions (part 1); Questions (part 2), Solutions (part 2) |
| Week 2 (January 27th-31st) | Tagging, and Hidden Markov Models (Slides) | All videos in Module 6 in Courseworks: The tagging problem (10:01) to Summary (1:50) inclusive | Notes on tagging problems, and hidden Markov models (required reading) | |
| Week 3 (February 3rd-7th) | Log-Linear Models (Slides) | All videos in Module 15 on Courseworks | Notes on Log-Linear Models (required reading) | Questions, Solutions, Past midterm question |
| Week 4 (February 10th-14th) | Parsing, and Context-free Grammars (Slides) | Courseworks videos: all of Module 7; Module 8, sections 8-1 to 8-3 inclusive | | Questions, Solutions |
| Week 5 (February 17th-21st) | Probabilistic Context-free Grammars (continued), and Lexicalized Context-free Grammars (Slides part 1), (Slides part 2), (Slides part 3) | Courseworks videos: Module 8, sections 8-4 to 8-6; all of Module 9; all of Module 10 | Notes on Probabilistic Context-Free Grammars (required reading) | |
| Week 6 (February 24th-28th) | Log-Linear Models for Tagging, and for History-based Parsing (Slides part 1), (Slides part 2) | Modules 16 and 17 in Courseworks | Notes on MEMMs (Log-Linear Tagging Models) (required reading) | Questions on CRFs (solutions are in section 4 of this note); Additional questions, Solutions |
| Week 7 (March 2nd-6th) | Feedforward Neural Networks (Slides) | Module 22 videos in Courseworks | Notes on Feedforward Neural Networks (required reading) | |
| Week 9 (March 26th-31st) | Computational Graphs, and Backpropagation (Slides) | Module 23 videos in Courseworks | Notes on Computational Graphs, and Backpropagation (required reading) | |
| Week 10 (April 2nd-7th) | Word Embeddings in Feedforward Networks; Tagging and Dependency Parsing using Feedforward Networks (Slides) | Module 24 videos in Courseworks | | |
| Week 11 (April 9th-14th) | Recurrent Networks, and LSTMs, for NLP (Slides) | Module 25 in Courseworks | | Questions, Solutions |
| Week 12 (April 16th-21st) | Recurrent Networks, and Attention, for Statistical Machine Translation (Slides) | Module 26 in Courseworks | | Questions, Solutions |
| Week 13 (April 23rd-28th) | Brown Clustering, and Word2Vec (Brown Clustering Slides), (Word2Vec Slides) | Module 18 in Courseworks | Word2Vec paper (we will cover it in the flipped classroom section) | Questions, Solutions |
| Week 14 (April 30th-May 4th) | TBD | | | |