The main ideas behind Backpropagation are super simple, but there are tons of details when it comes time to implement it. This video shows how to optimize three parameters in a Neural Network simultaneously and introduces some Fancy Notation.
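For reference, here's a minimal Python sketch of the core idea: using the Chain Rule to get derivatives and then running Gradient Descent on three parameters at the same time. The network shape, data values, and parameter names (w3, w4, b3) are assumptions modeled on the simple network used throughout this series, not the video's exact numbers.

import numpy as np

def softplus(x):
    return np.log(1 + np.exp(x))

# Toy data: inputs and observed target values (assumed for illustration).
x = np.array([0.0, 0.5, 1.0])
observed = np.array([0.0, 1.0, 0.0])

# Parameters leading into the hidden layer are held fixed here;
# only w3, w4, and b3 (the last two weights and the final bias) are optimized.
w1, b1, w2, b2 = 3.34, -1.43, -3.53, 0.57
w3, w4, b3 = 0.36, 0.63, 0.0

learning_rate = 0.1
for step in range(500):
    # Forward pass: hidden-node activations and predicted values.
    y1 = softplus(w1 * x + b1)
    y2 = softplus(w2 * x + b2)
    predicted = w3 * y1 + w4 * y2 + b3

    # Chain Rule on the Sum of Squared Residuals:
    # d(SSR)/d(param) = sum(-2 * residual * d(predicted)/d(param)).
    residual = observed - predicted
    d_w3 = np.sum(-2 * residual * y1)
    d_w4 = np.sum(-2 * residual * y2)
    d_b3 = np.sum(-2 * residual * 1)

    # Gradient Descent: update all three parameters simultaneously.
    # Note that each derivative was computed exactly as it would be if
    # we were optimizing that parameter alone.
    w3 -= learning_rate * d_w3
    w4 -= learning_rate * d_w4
    b3 -= learning_rate * d_b3

print(w3, w4, b3)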
NOTE: This StatQuest assumes that you already know the main ideas behind Backpropagation: https://youtu.be/IN2XmBhILt4
...and that also means you should be familiar with...
Neural Networks: https://youtu.be/CqOfi41LfDw
The Chain Rule: https://youtu.be/wl1myxrtQHQ
Gradient Descent: https://youtu.be/sDv4f4s2SB8
LAST NOTE: When I was researching this 'Quest, I found this page by Sebastian Raschka to be helpful: https://sebastianraschka.com/faq/docs/backprop-arbitrary.html
For a complete index of all the StatQuest videos, check out:
https://statquest.org/video-index/
If you'd like to support StatQuest, please consider...
Patreon: https://www.patreon.com/statquest
...or...
YouTube Membership: https://www.youtube.com/channel/UCtYLUTtgS3k1Fg4y5tAhLbw/join
...buying one of my books, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/
...or just donating to StatQuest!
https://www.paypal.me/statquest
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on Twitter:
https://twitter.com/joshuastarmer
0:00 Awesome song and introduction
3:01 Derivatives do not change when we optimize multiple parameters
6:28 Fancy Notation
10:51 Derivatives with respect to two different weights
15:02 Gradient Descent for three parameters
17:19 Fancy Gradient Descent Animation
#StatQuest #NeuralNetworks #Backpropagation