Lecture recordings will be posted on Panopto.
| Week | Date | Topic | Assignments | Quizzes | References |
|---|---|---|---|---|---|
| 1 | 8/25/2021 | Introduction | | | |
| | 8/27/2021 | Text classification | | | Eisenstein: 2.1-2.3, 2.5-2.7, 3.1-3.4 |
| 2 | 9/1/2021 | Language modeling 1 (n-gram models, estimation) | | | Eisenstein: 6.1; Jurafsky & Martin: 3.1 |
| | 9/3/2021 | Language modeling 2 (evaluation, information theory, smoothing) | Assignment 1 out | | Eisenstein: 6.2, 6.4; Jurafsky & Martin: 3.2, 3.4, 3.5, 3.7 |
| 3 | 9/8/2021 | Language modeling 3 (log-linear models, feed-forward neural models, backprop) | | | Jurafsky & Martin: Chapter 7; Graham Neubig's Notes |
| | 9/10/2021 | Language modeling 4 (RNNs, BPTT, attention, transformers) | | | Eisenstein: 6.3; Jurafsky & Martin: 9.1-9.4 |
| 4 | 9/15/2021 | Structured prediction 1 (generative sequence models, Viterbi) | | | Eisenstein: 7.1-7.4, 8.1-8.3 |
| | 9/17/2021 | Structured prediction 2 (discriminative sequence models, CRFs) | | | Eisenstein: 7.5-7.6 (excluding 7.5.1-7.5.2); Sutton & McCallum CRF tutorial |
| 5 | 9/20/2021 | N/A | Assignment 1 due! | | |
| | 9/22/2021 | Structured prediction 3 (PCFGs, CKY) | | | Eisenstein: 9.2, 10.1-10.2 |
| | 9/24/2021 | Structured prediction 4 (discriminative parsing, neural parsing) | Assignment 2 out | Quiz 1 | Eisenstein: 10.3-10.5 |
| 6 | 9/29/2021 | Structured prediction 5 (seq2seq, decoding) | | | Eisenstein: 18.3-18.5 |
| | 10/1/2021 | Latent variable models 1 (structured and unstructured models, inference) | | | Latent Variables for NLP tutorial (Kim et al., 2018) |
| 7 | 10/6/2021 | Latent variable models 2 (variational inference, VAEs) | | | Latent Variables for NLP tutorial (Kim et al., 2018) |
| | 10/8/2021 | Formal semantics (FOL, lambda calculus, semantic parsing) | Assignment 2 due | | Eisenstein: Chapter 12 |
| 8 | 10/13/2021 | Distributional semantics | | | Eisenstein: 14.1-14.3, 14.5, 14.6 |
| | 10/15/2021 | Pretraining 1 | Assignment 3 out | | Peters et al., 2018; Radford et al., 2018 |
| 9 | 10/20/2021 | Pretraining 2 | Project proposal due | | Devlin et al., 2018; Raffel et al., 2019; Brown et al., 2020 |
| | 10/22/2021 | Review | | | Crafting Papers on Machine Learning; Stanford CS224N Project Tips |
| 10 | 10/27/2021 | Information Extraction | | | Eisenstein: 17.1, 17.2; Gillick et al., 2019; Soares et al., 2019 |
| | 10/29/2021 | QA 1 (classical, reading comprehension) | Assignment 3 due | | Jurafsky & Martin: 23.1; Karpukhin et al., 2020 |
| 11 | 11/3/2021 | QA 2 (open domain, retrieval, end-to-end) | | Quiz 2 | Jurafsky & Martin: 23.2; Seo et al., 2016 |
| | 11/5/2021 | Text generation | | | Eisenstein: Chapters 18, 19 |
| 12 | 11/10/2021 | Efficient transformer variants | | | Efficient Transformers: A Survey (Tay et al., 2020) |
| | 11/12/2021 | Ethics, Robustness, Interpretability | | | |
| 13 | 11/17/2021 | Project presentations | | | |
| | 11/19/2021 | Project presentations | | | |