Dan Roth

Course Description

Making decisions in natural language processing problems often involves assigning values to sets of interdependent variables, where the expressive dependency structure can influence, or even dictate, what assignments are possible. Structured learning problems such as semantic role labeling provide one such example, but the setting is broader and includes a range of problems such as named entity and relation recognition and co-reference resolution. The setting is also appropriate for cases in which a solution must make use of multiple models (possibly pre-designed or pre-learned components), as in summarization, textual entailment and question answering.

This semester, we will devote the course to the study of structured learning problems in natural language processing. We will start by recalling the “standard” learning formulations used in NLP, move to formulations of multiclass classification, and from there focus on models of structured prediction and how they are used in NLP.

Through lectures and paper presentations, the course will introduce some of the central learning frameworks and techniques that have emerged in this area over the last few years, along with their application to multiple problems in NLP and Information Extraction. The course will cover:

Models: We will present discriminative models such as the structured Perceptron (sketched below) and structured SVMs, probabilistic models, and Constrained Conditional Models.

Training Paradigms: Joint Learning models; Decoupling Learning from Inference; Constraint-Driven Learning; Semi-Supervised Learning of Structure; Indirect Supervision.

Inference: Constrained Optimization Models, Integer Linear Programming, Approximate Inference, Dual Decomposition.
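As a point of reference for how these pieces fit together, here is a minimal sketch of the structured Perceptron, whose mistake-driven update calls an inference (argmax) routine inside the learning loop. The feature map phi, the decoder argmax_y, and the function name are illustrative placeholders, not course code.

```python
import numpy as np

def structured_perceptron(train, phi, argmax_y, dim, epochs=5):
    """Minimal structured Perceptron sketch (illustrative, not course code).

    train:    list of (x, y_gold) pairs
    phi:      feature map, phi(x, y) -> np.ndarray of length dim
    argmax_y: decoder, argmax_y(w, x) -> highest-scoring structure (the inference step)
    """
    w = np.zeros(dim)
    for _ in range(epochs):
        for x, y_gold in train:
            y_hat = argmax_y(w, x)                   # inference under the current weights
            if y_hat != y_gold:                      # mistake-driven update toward the gold structure
                w += phi(x, y_gold) - phi(x, y_hat)
    return w
```

The argmax step is where the inference machinery listed above, such as Integer Linear Programming, approximate inference, or dual decomposition, would be plugged in.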

Location

www.cis.upenn.edu
