Multi Head Attention in Transformer Neural Networks with Code!

CodeEmporium · 59,073 views · 2 years ago

Let's talk about multi-head attention in transformer neural networks.

We'll go through the intuition, math, and code of multi-head (self-)attention in transformer neural networks.
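
As a quick reference for the "Code Breakdown" and "Final Coded Class" sections, here is a minimal multi-head attention sketch in PyTorch. It is not the exact code from the notebook in [1]; the class and layer names (MultiHeadAttention, qkv_layer, out_layer) are illustrative, and the default sizes (d_model=512, num_heads=8) follow the Transformer paper [2].

import math
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    def __init__(self, d_model=512, num_heads=8):
        super().__init__()
        assert d_model % num_heads == 0, "d_model must be divisible by num_heads"
        self.num_heads = num_heads
        self.head_dim = d_model // num_heads
        # One linear layer produces queries, keys and values for all heads at once
        self.qkv_layer = nn.Linear(d_model, 3 * d_model)
        self.out_layer = nn.Linear(d_model, d_model)

    def forward(self, x, mask=None):
        batch, seq_len, d_model = x.shape
        qkv = self.qkv_layer(x)                                    # (batch, seq_len, 3 * d_model)
        qkv = qkv.reshape(batch, seq_len, self.num_heads, 3 * self.head_dim)
        qkv = qkv.permute(0, 2, 1, 3)                              # (batch, heads, seq_len, 3 * head_dim)
        q, k, v = qkv.chunk(3, dim=-1)                             # each (batch, heads, seq_len, head_dim)
        # Scaled dot-product attention, computed for every head in parallel
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.head_dim)
        if mask is not None:
            scores = scores + mask                                 # e.g. -inf above the diagonal for a decoder
        attention = torch.softmax(scores, dim=-1)
        values = attention @ v                                     # (batch, heads, seq_len, head_dim)
        # Concatenate the heads back to d_model and mix them with a final linear layer
        values = values.permute(0, 2, 1, 3).reshape(batch, seq_len, d_model)
        return self.out_layer(values)

# Quick shape check
x = torch.randn(2, 10, 512)              # (batch, sequence length, model dimension)
print(MultiHeadAttention()(x).shape)     # torch.Size([2, 10, 512])

The key idea: a single large linear layer computes Q, K, and V for every head at once, each head attends independently over its own slice of the model dimension, and a final linear layer recombines the heads.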

ABOUT ME
⭕ Subscribe: https://www.youtube.com/c/CodeEmporium?sub_confirmation=1
📚 Medium Blog: https://medium.com/@dataemporium
💻 Github: https://github.com/ajhalthor
👔 LinkedIn: https://www.linkedin.com/in/ajay-halthor-477974bb/

RESOURCES
[1 🔎] Code for video: https://github.com/ajhalthor/Transformer-Neural-Network/blob/main/Mutlihead_Attention.ipynb
[2 🔎] Transformer Main Paper: https://arxiv.org/abs/1706.03762
[3 🔎] Bidirectional RNN Paper: https://deeplearning.cs.cmu.edu/F20/document/readings/Bidirectional%20Recurrent%20Neural%20Networks.pdf


PLAYLISTS FROM MY CHANNEL
⭕ ChatGPT Playlist of all other videos: https://youtube.com/playlist?list=PLTl9hO2Oobd9coYT6XsTraTBo4pL1j4HJ
⭕ Transformer Neural Networks: https://youtube.com/playlist?list=PLTl9hO2Oobd_bzXUpzKMKA3liq2kj6LfE
⭕ Convolutional Neural Networks: https://youtube.com/playlist?list=PLTl9hO2Oobd9U0XHz62Lw6EgIMkQpfz74
⭕ The Math You Should Know: https://youtube.com/playlist?list=PLTl9hO2Oobd-_5sGLnbgE8Poer1Xjzz4h
⭕ Probability Theory for Machine Learning: https://youtube.com/playlist?list=PLTl9hO2Oobd9bPcq0fj91Jgk_-h1H_W3V
⭕ Coding Machine Learning: https://youtube.com/playlist?list=PLTl9hO2Oobd82vcsOnvCNzxrZOlrz3RiD


MATH COURSES (7 day free trial)
📕 Mathematics for Machine Learning: https://imp.i384100.net/MathML
📕 Calculus: https://imp.i384100.net/Calculus
📕 Statistics for Data Science: https://imp.i384100.net/AdvancedStatistics
📕 Bayesian Statistics: https://imp.i384100.net/BayesianStatistics
📕 Linear Algebra: https://imp.i384100.net/LinearAlgebra
📕 Probability: https://imp.i384100.net/Probability

OTHER RELATED COURSES (7 day free trial)
📕 ⭐ Deep Learning Specialization: https://imp.i384100.net/Deep-Learning
📕 Python for Everybody: https://imp.i384100.net/python
📕 MLOps Course: https://imp.i384100.net/MLOps
📕 Natural Language Processing (NLP): https://imp.i384100.net/NLP
📕 Machine Learning in Production: https://imp.i384100.net/MLProduction
📕 Data Science Specialization: https://imp.i384100.net/DataScience
📕 Tensorflow: https://imp.i384100.net/Tensorflow

TIMESTAMPS
0:00 Introduction
0:33 Transformer Overview
2:32 Multi-head attention theory
4:35 Code Breakdown
13:47 Final Coded Class
