Explore two learning algorithms for neural networks: stochastic gradient descent and a simple evolutionary algorithm known as local search. They fundamentally solve the same problem in similar ways, but one has a key advantage. Step by step, they find their way down Loss Mountain. Watch real neural networks maximize the fitness of curve fitting. We've got Dodgson here!
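If you want to poke at the two algorithms in code, here's a minimal Python sketch (not the video's or the webtoys' actual code; the network size, step counts, and mutation size are made up for illustration). It fits y = sin(x) with a tiny one-hidden-layer network two ways: random-mutation hill climbing (local search) and gradient descent via backpropagation.

```python
# Minimal illustrative sketch (not the video's code): fit y = sin(x) with a tiny
# one-hidden-layer tanh network, trained by hill climbing vs. gradient descent.
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
Y = np.sin(X)

def init_params(hidden=8):
    # weights/biases for: input -> hidden (tanh) -> output
    return [rng.normal(0, 1, (1, hidden)), np.zeros(hidden),
            rng.normal(0, 1, (hidden, 1)), np.zeros(1)]

def forward(params, x):
    w1, b1, w2, b2 = params
    return np.tanh(x @ w1 + b1) @ w2 + b2

def loss(params):
    return np.mean((forward(params, X) - Y) ** 2)   # mean squared error

# Local search / hill climbing: randomly nudge every weight and keep the mutant
# only if it lowers the loss -- a blind step down Loss Mountain.
def hill_climb(steps=20000, sigma=0.05):
    p = init_params()
    best = loss(p)
    for _ in range(steps):
        child = [w + rng.normal(0, sigma, w.shape) for w in p]
        l = loss(child)
        if l < best:
            p, best = child, l
    return best

# Gradient descent: use backpropagation to get the exact downhill direction,
# then step all weights along it.
def gradient_descent(steps=2000, lr=0.1):
    w1, b1, w2, b2 = init_params()
    n = len(X)
    for _ in range(steps):
        h = np.tanh(X @ w1 + b1)            # hidden activations
        yhat = h @ w2 + b2
        d_yhat = 2 * (yhat - Y) / n         # dLoss/dPrediction
        d_w2 = h.T @ d_yhat
        d_b2 = d_yhat.sum(axis=0)
        d_pre = (d_yhat @ w2.T) * (1 - h ** 2)   # tanh derivative
        d_w1 = X.T @ d_pre
        d_b1 = d_pre.sum(axis=0)
        w1 -= lr * d_w1; b1 -= lr * d_b1
        w2 -= lr * d_w2; b2 -= lr * d_b2
    return loss([w1, b1, w2, b2])

print("hill climbing final loss:   ", hill_climb())
print("gradient descent final loss:", gradient_descent())
```

Both runs drive the loss down, but with a comparable budget the gradient-descent run typically reaches a lower loss much faster, which is the advantage the video digs into.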
Special thanks to Andrew Carr (https://x.com/andrew_n_carr) and Josh Greaves for reviewing this with their human neurons, and to the artificial neurons of Grok, o3-mini, and Claude. Grok thought the gay joke was funny; o3 thought it wasn't inclusive lol. It is inclusive!
~Webtoys~
Hill Climbers: https://neuralpatterns.io/hill_climber.html
Neuron Tuner: https://neuralpatterns.io/nn_tuner.html
Subscribe to my music guy NOW: https://www.youtube.com/channel/UCIhjfe2-Xdjtp7LoqVLIcaA
~Links~
Patreon: https://www.patreon.com/c/emergentgarden
Kofi: https://ko-fi.com/emergentgarden
My Twitter: https://twitter.com/max_romana
My Bluesky: https://bsky.app/profile/emergentgarden.bsky.social
My Other NN videos: https://www.youtube.com/playlist?list=PL_UEf8P1IjTjsbPasIQf3jWfQnM0xt0ZN
Webtoy Source: https://github.com/MaxRobinsonTheGreat/hillclimbers
Animation Source: https://github.com/MaxRobinsonTheGreat/ManimApproximations
Image Approximators: https://github.com/MaxRobinsonTheGreat/mandelbrotnn
FUNCTIONS DESCRIBE THE WORLD: https://www.youtube.com/watch?v=zHU1xH6Ogs4
Dawkins Climbing Mount Improbable: https://www.youtube.com/watch?v=2X1iwLqM2t0
But he's gay: https://www.youtube.com/watch?v=K1Y6PchDYfw
~Citations~
Unfortunately, many of these are behind paywalls.
NNs are Universal Function Approximators: https://www.cs.cmu.edu/~epxing/Class/10715/reading/Kornick_et_al.pdf
Backpropagation: https://www.nature.com/articles/323533a0
Loss Surfaces of MLPs: https://arxiv.org/abs/1412.0233
~Timestamps~
(0:00) Learning Learning
(1:20) Neural Network Space
(3:40) The Loss Landscape
(7:21) The Blind Mountain Climber
(8:37) Evolution (Local Search)
(13:07) Gradient Descent
(18:40) The Gradient Advantage
(20:48) The Evolutionary (dis)advantage