Final Topics

General Linguistics Concepts

(Parts of J&M 2nd ed. Ch. 1, but mostly split over different chapters in J&M; Ch. 1 is not yet available in the 3rd ed.)

Text Processing

(Split over different chapters in J&M. Parts of J&M 3rd ed. Ch. 6)

Text Classification

(J&M 3rd ed. Ch. 6)

Probability Background

Language Models

(J&M 2nd ed. Ch. 4.1-4.8, J&M 3rd ed. Ch. 4)

Sequence Labeling (POS tagging)

(J&M 2nd ed. Ch. 5.1-5.5, J&M 3rd ed. Ch. 10.1-10.4)

Parsing with Context-Free Grammars

(J&M 2nd ed. Ch. 12, Ch. 13.1-13.4, Ch. 14.1-14.4, and Ch. 16; J&M 3rd ed. Ch. 11.1-11.5, Ch. 12.1-12.2 [Earley not covered in 3rd ed.], and Ch. 13.1-13.4 [complexity classes not covered in 3rd ed.])

Dependency parsing

(Not in J&M 2nd ed.; J&M 3rd ed. Ch. 14.1-14.5. Supplementary material: Kübler, McDonald, and Nivre (2009), Dependency Parsing, Ch. 3 to 4.2)

Machine Learning

(Some textbook references below)

Two Approaches to Language Meaning

Formal Lexical Semantics

(J&M 3rd ed. Ch. 17, J&M 2nd ed. Ch. 19.1-19.3 and 20.1-20.4)

Distributional (Vector-based) Lexical Semantics

(J&M 3rd ed. Ch. 15 & 16, not in 2nd ed.)

Recurrent Neural Nets

Transformer Based Models

(not in textbook)

Semantic Role Labeling

(J&M Ch. 22)

Statistical MT

(J&M 2nd ed. Ch.; also see Michael Collins' notes on IBM Model 2)