
The Attention Mechanism 1 hour explanation

Vizuara 4,556 views 1 month ago

In this video, we take a deep dive into the history of the attention mechanism and also work through self-attention.

The lecture will teach you the following:

(1) History of Generative AI
(2) Issues with RNNs and LSTMs
(3) The need for attention
(4) What is the attention mechanism?
(5) What is self-attention?
(6) Why do we need trainable matrices like Query, Key and Value?
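To make points (4)–(6) concrete, here is a minimal NumPy sketch of scaled dot-product self-attention with trainable Query, Key, and Value matrices. The shapes and random weights below are illustrative assumptions, not taken from the lecture:

```python
import numpy as np

def self_attention(X, W_q, W_k, W_v):
    """Scaled dot-product self-attention over a sequence of token embeddings X."""
    Q = X @ W_q  # queries: what each token is looking for
    K = X @ W_k  # keys: what each token offers to others
    V = X @ W_v  # values: the content that actually gets mixed
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # pairwise compatibility, scaled
    # row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted mix of all value vectors

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))  # 4 tokens, embedding dimension 8 (assumed)
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (4, 8): one context-aware vector per token
```

Without the trainable W_q, W_k, W_v projections, every token would compare and mix raw embeddings directly; the learned matrices let the model decide, per role, what to search for, what to match on, and what to pass along.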

======================================================

This video is sponsored by invideoAI (https://invideo.io/).

invideoAI is looking for talented engineers, junior research scientists and research scientists to join their team.

Elixir/Rust full stack engineer:
https://invideo.notion.site/Elixir-Rust-full-stack-engineer-158316ee111a8044846be07038d3e481

Research scientist - generative AI:
https://invideo.notion.site/Research-scientist-generative-AI-17c316ee111a8096bae4c7669b602dec

If you want to apply for any of the ML or engineering roles, reach out to them at careers@invideo.io

======================================================
