
What is Self Attention | Transformers Part 2 | CampusX

CampusX · 58,197 · 1 year ago

Self Attention is a mechanism that enables transformers to weigh the importance of different words in a sequence relative to each other. It allows the model to focus on relevant information, improving its ability to capture long-range dependencies in data.

Digital Notes for Deep Learning: https://shorturl.at/NGtXg

============================
Do you want to learn from me? Check out my affordable mentorship program at: https://learnwith.campusx.in/s/store
============================

📱 Grow with us:
CampusX LinkedIn: https://www.linkedin.com/company/campusx-official
CampusX on Instagram for daily tips: https://www.instagram.com/campusx.official
My LinkedIn: https://www.linkedin.com/in/nitish-singh-03412789
Discord: https://discord.gg/PsWu8R87Z8
E-mail us at [email protected]

✨ Hashtags ✨
#SelfAttention #DeepLearning #CampusX #NLP

⌚ Time Stamps ⌚
00:00 - Intro
01:50 - What is Self Attention?
11:41 - The problem of "Average Meaning"
22:46 - Outro
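For reference, the mechanism described in the opening sentences can be sketched in a few lines of NumPy. This is a minimal, illustrative version of scaled dot-product self-attention, not code from the video: the token list, embedding size, and random weight matrices are assumptions made purely for the demo. It shows how each token's output becomes a weighted average of all tokens' value vectors, i.e. a context-dependent embedding rather than a single fixed "average meaning" per word.

import numpy as np

np.random.seed(0)

# Toy sentence of 4 tokens, each with an 8-dimensional static embedding
# (illustrative values only).
tokens = ["money", "bank", "grows", "fast"]
d_model = 8
X = np.random.randn(len(tokens), d_model)

# Learnable projections (random here) mapping each embedding to
# query, key, and value vectors.
W_q = np.random.randn(d_model, d_model)
W_k = np.random.randn(d_model, d_model)
W_v = np.random.randn(d_model, d_model)

Q = X @ W_q
K = X @ W_k
V = X @ W_v

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Attention scores: how relevant every token is to every other token,
# scaled by sqrt(d_k) so the softmax does not saturate.
scores = Q @ K.T / np.sqrt(d_model)
weights = softmax(scores, axis=-1)   # each row sums to 1

# Each output row is a weighted average of the value vectors:
# a contextual embedding for that token.
contextual = weights @ V

print("attention weights:\n", np.round(weights, 2))
print("contextual embeddings shape:", contextual.shape)

Running the sketch prints a 4x4 matrix of attention weights (one row per token, each row summing to 1) and a 4x8 matrix of contextual embeddings, which is the intuition behind replacing a word's fixed embedding with one that depends on its neighbours.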
