Encoder-Only Transformers are the backbone for RAG (Retrieval-Augmented Generation), sentiment analysis, classification, and clustering. This StatQuest covers the main ideas of how these powerhouses do what they do so well, making sure each step is clearly explained!
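For a little extra context, here's a minimal sketch (not from the video) of how an encoder-only model's embeddings can power the applications listed above. It assumes the sentence-transformers library and the all-MiniLM-L6-v2 checkpoint, which aren't mentioned in the video; they're just a common stand-in for an encoder-only model.

```python
# A minimal sketch: use an encoder-only model to turn sentences into
# context-aware embeddings, then compare them with cosine similarity --
# the same idea behind retrieval for RAG, clustering, and simple
# classifiers built on top of the embeddings.
# Assumes: pip install sentence-transformers numpy
import numpy as np
from sentence_transformers import SentenceTransformer

# Any encoder-only checkpoint works; all-MiniLM-L6-v2 is a small, common choice.
model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "StatQuest explains word embedding.",
    "Positional encoding keeps track of word order.",
    "My cat knocked a glass off the table.",
]
query = "How do transformers know the order of the words?"

# encode() runs each sentence through the encoder and returns one vector per sentence.
doc_vecs = model.encode(documents)   # shape: (3, embedding_dim)
query_vec = model.encode(query)      # shape: (embedding_dim,)

# Cosine similarity: higher score = more similar meaning.
def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = [cosine(query_vec, d) for d in doc_vecs]
best = int(np.argmax(scores))
print(f"Most relevant document: {documents[best]!r} (score {scores[best]:.2f})")
```

In a RAG pipeline, the document the query scores highest against would be handed to a generative model as context; for clustering or classification, the same embeddings feed a clustering algorithm or a simple classifier instead.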
NOTE: If you'd like to learn more details about the various components mentioned in the video, check out these 'Quests:
Transformers: https://youtu.be/zxQyTK8quyY
Decoder-Only Transformers: https://youtu.be/bQ5BoolX9Ag
The Matrix Math Behind Transformers: https://youtu.be/KphmOJnLAdI
Coding a Decoder-Only Transformer from Scratch in PyTorch: https://youtu.be/C9QSpl5nmrY
Word Embedding: https://youtu.be/viZrOnJclY0
Logistic Regression: https://youtu.be/yIYKR4sgzI8
For a complete index of all the StatQuest videos, check out:
https://statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Patreon: https://www.patreon.com/statquest
...or...
YouTube Membership: https://www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join
...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/
...or just donating to StatQuest!
https://www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
https://twitter.com/joshuastarmer
0:00 Awesome song and introduction
3:30 Word Embedding
11:15 Positional Encoding
12:39 Attention
15:17 Applications of Encoder-Only Transformers
16:19 RAG (Retrieval-Augmented Generation)
#StatQuest #transformers