E1. Language Technology

Instructor: Ioannis Pavlopoulos

Elective, Teaching Period C, ECTS: 6

Contents

N-gram language models. Entropy, cross-entropy, perplexity. Spelling correction. Bag-of-words text representations. Feature selection and extraction. Text classification with k-nearest neighbors and Naive Bayes. Clustering words and texts with k-means. Logistic regression, stochastic gradient descent, multi-layer perceptrons, and backpropagation for text classification. Pre-trained word embeddings, Word2Vec, FastText. Recurrent neural networks (RNNs), GRU and LSTM cells, RNNs with self-attention, bidirectional, stacked, and hierarchical RNNs, with applications to language modeling, text classification, and sequence labeling. Sequence-to-sequence RNN models, machine translation. Pre-trained RNN language models, ELMo. Convolutional neural networks and applications to text processing. Transformers, BERT. Syntactic dependency parsing and relation extraction with deep learning models. Question answering systems for document collections. Dialogue systems.
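
As a minimal illustration of the first topics above (n-gram language models, cross-entropy, perplexity), the sketch below builds a bigram model with add-one smoothing in Python, the language used in the course's programming assignments. The toy corpus and held-out sentence are invented for this example and are not course material.

    # Illustrative sketch only: a bigram language model with add-one (Laplace)
    # smoothing and its perplexity on a held-out sentence. The corpus is a toy
    # example chosen for brevity.
    import math
    from collections import Counter

    train = "the cat sat on the mat the dog sat on the rug".split()
    vocab = set(train)
    V = len(vocab)

    unigrams = Counter(train)                      # word counts
    bigrams = Counter(zip(train, train[1:]))       # consecutive word pair counts

    def bigram_prob(prev, word):
        # Add-one smoothed conditional probability P(word | prev).
        return (bigrams[(prev, word)] + 1) / (unigrams[prev] + V)

    def perplexity(tokens):
        # Perplexity is the exponential of the average negative log-probability
        # per bigram, i.e. exp(cross-entropy) of the model on the sequence.
        log_prob = sum(math.log(bigram_prob(p, w))
                       for p, w in zip(tokens, tokens[1:]))
        return math.exp(-log_prob / (len(tokens) - 1))

    print(perplexity("the cat sat on the rug".split()))  # lower is better

Lower perplexity means the model assigns higher probability to the held-out text; the cross-entropy covered in the course is the logarithm of this quantity.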

Prerequisites

Students should have basic knowledge of calculus, linear algebra, and probability theory. Programming experience in Python is required for the programming assignments of the course. Students are also advised to attend the course “Deep Learning”, but this is not required.

Target Learning Outcomes

After successfully completing the course, students will be able to:

  • describe a wide range of possible applications of Natural Language Processing,
  • describe Natural Language Processing algorithms that can be used in particular applications,
  • select and implement appropriate Natural Language Processing algorithms for particular applications,
  • evaluate the effectiveness and efficiency of Natural Language Processing methods and systems.

Recommended Bibliography

  • Speech and Language Processing, Daniel Jurafsky and James H. Martin, Pearson Education, 2nd edition, 2009, ISBN-13: 978-0135041963.
  • Neural Network Methods for Natural Language Processing, Yoav Goldberg, Morgan & Claypool Publishers, 2017, ISBN-13: 978-1627052986.
  • Introduction to Natural Language Processing, Jacob Eisenstein, MIT Press, 2019, ISBN-13: 978-0262042840.
  • Foundations of Statistical Natural Language Processing, Christopher D. Manning and Hinrich Schütze, MIT Press, 1999, ISBN-13: 978-0262133609.
  • Introduction to Information Retrieval, Christopher D. Manning, Prabhakar Raghavan and Hinrich Schütze, Cambridge University Press, 2008, ISBN-13: 978-0521865715.

Teaching and Learning Activities

One three-hour lecture per week, with study exercises and programming exercises assigned as homework (some to be submitted).

Assessment and Grading Methods

The final grade is the average of the final examination grade (50%) and the grade of the submitted study and programming exercises (50%), provided that the final examination grade is at least 5/10; otherwise, the final grade equals the final examination grade. For example, an examination grade of 6/10 and an exercises grade of 8/10 yield a final grade of 7/10, whereas an examination grade below 5/10 becomes the final grade regardless of the exercises.
