Positional Encoding in Transformers | Deep Learning | CampusX

CampusX · 43,417 · 11 months ago

Positional Encoding is a technique used in transformers to inject information about the position of tokens in a sequence. Since transformers have no inherent awareness of sequence order, positional encodings enable the model to capture the order of words, which is crucial for understanding context. These encodings are added to the input embeddings, allowing the model to process and learn relationships based on token positions. (A short code sketch of the sinusoidal scheme appears after the timestamps below.)

Blog link: https://blog.timodenk.com/linear-relationships-in-the-transformers-positional-encoding/
Digital Notes for Deep Learning: https://shorturl.at/NGtXg

============================

Did you like my teaching style? Check my affordable mentorship program at: https://learnwith.campusx.in
DSMP FAQ: https://docs.google.com/document/d/1OsMe9jGHoZS67FH8TdIzcUaDWuu5RAbCbBKk2cNq6Dk/edit#heading=h.gvv0r2jo3vjw

============================

📱 Grow with us:
CampusX on LinkedIn: https://www.linkedin.com/company/campusx-official
CampusX on Instagram for daily tips: https://www.instagram.com/campusx.official
My LinkedIn: https://www.linkedin.com/in/nitish-singh-03412789
Discord: https://discord.gg/PsWu8R87Z8
E-mail us at [email protected]

✨ Hashtags ✨
#datascience #positionalencoding #campusx #deeplearning

⌚ Time Stamps ⌚
00:00 - Intro
01:18 - Why is positional encoding required?
07:58 - Proposing a simple solution
22:17 - The sine function as a solution
34:26 - Explaining positional encoding
55:25 - Interesting observations
01:04:06 - Mind-blowing solution
01:10:13 - Blog for mathematical intuition
01:12:45 - Outro
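For reference, here is a minimal NumPy sketch of the sinusoidal positional encoding scheme from "Attention Is All You Need", which the video covers; the function name, shapes, and toy example are illustrative assumptions, not code taken from the lesson.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Build the (seq_len, d_model) sinusoidal positional encoding matrix.

    PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
    """
    positions = np.arange(seq_len)[:, np.newaxis]            # (seq_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]           # (1, d_model/2)
    angle_rates = 1.0 / np.power(10000.0, dims / d_model)    # one frequency per dimension pair
    angles = positions * angle_rates                         # (seq_len, d_model/2)

    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)   # even dimensions get sine
    pe[:, 1::2] = np.cos(angles)   # odd dimensions get cosine
    return pe

# The encoding is simply added to the token embeddings before the first attention layer.
# Toy shapes (assumed for illustration): 8 tokens, embedding dimension 16.
embeddings = np.random.randn(8, 16)   # stand-in for learned token embeddings
encoded = embeddings + sinusoidal_positional_encoding(8, 16)
print(encoded.shape)                  # (8, 16)
```

Because each position maps to a fixed pattern of sines and cosines at different frequencies, nearby positions get similar vectors and any offset between positions can be expressed as a linear transformation of the encoding, which is the property explored in the linked blog post.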