This is the first video on attention mechanisms. We'll start with self-attention and end with transformers.
We'll take it step by step, but if you'd like to read about it all in full detail right away, we recommend these online documents:
- http://www.peterbloem.nl/blog/transformers
- http://jalammar.github.io/illustrated-transformer/
- http://d2l.ai/chapter_attention-mechanisms/attention.html
The general GitHub repo for this playlist can be found here: https://github.com/RasaHQ/algorithm-whiteboard-resources.