The Evolution of Natural Language Processing
James Briggs
towardsdatascience.com
Attention Is All You Need. That is the title of the 2017 paper that introduced attention as a standalone learning mechanism, the herald of our now transformer-dominant world in natural language processing (NLP).
Transformers are the new cutting edge in NLP, and they may seem somewhat abstract, but when we look at the past decade of developments in NLP they begin to make sense.
We will cover these developments and look at how they have led to the transformers used today. This article does not assume you already understand these concepts; we will build an intuitive understanding without getting overly technical.
We will cover:
- Natural Language Neural Nets
- Recurrence
- Vanishing Gradients
- Long Short-Term Memory
- Attention
- Attention is All You Need
- Self-Attention
- Multi-Head Attention
- Positional Encoding
- Transformers