Instructors: Sam Wiseman (swiseman@cs.duke.edu), Bhuwan Dhingra (bdhingra@cs.duke.edu)
Lectures: Wednesday and Friday, 1:45pm – 3:00pm in LSRC D106
TA: Yu Tang (yu.tang@duke.edu)
This class is a graduate-level introduction to the methodologies underlying modern natural language processing (NLP), the study of computing systems that process human language. Such systems are common today (search engines, document editors, and smart speakers all use NLP) and are widely used in industry for analyzing textual data. The class is intended for graduate students and upper-level CS undergraduates.
The course will cover a wide range of NLP tasks and the methods for solving them, which largely consist of machine learning and, more recently, deep learning. The lectures will cover the mathematics behind these methods, and the assignments will require students to implement some of them using a standard machine learning library (PyTorch). By the end of the course, students should be familiar with the main applications of NLP in academia and industry; they should be able to identify appropriate techniques for tackling these applications, read research papers about those techniques, and implement some of the simpler ones in Python. Topics include language modeling, generative and discriminative models of sequences and trees, latent variable models, formal semantic methods, pretraining, and transfer learning.
Programming assignments will involve building scalable machine learning systems for various NLP tasks. For the final project, students will work in teams of up to four to either pursue original research on an NLP problem or reproduce and analyze results from a prior paper.
Prerequisites: There are no formal requirements, but students are expected to have completed an undergraduate-level machine learning or statistical inference course. Comfort with linear algebra and programming in Python will also be beneficial.