27. Backpropagation: Find Partial Derivatives

MIT OpenCourseWare · 63,498 views · 6 years ago

MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018
Instructor: Gilbert Strang
View the complete course: https://ocw.mit.edu/18-065S18
YouTube Playlist: https://www.youtube.com/playlist?list=PLUl4u3cNGP63oMNUHXqIUcrkS2PivhN3k

In this lecture, Professor Strang presents Professor Sra's theorem, which proves the convergence of stochastic gradient descent (SGD). He then reviews backpropagation, a method to compute derivatives quickly using the chain rule.

Note: Videos of Lectures 28 and 29 are not available because those were in-class lab sessions that were not recorded.

License: Creative Commons BY-NC-SA
More information at https://ocw.mit.edu/terms
More courses at https://ocw.mit.edu
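To illustrate the chain-rule idea behind backpropagation mentioned in the description, here is a minimal sketch (not from the lecture itself): it computes partial derivatives of a tiny one-neuron model's squared-error loss by recording intermediates in a forward pass, then applying the chain rule in reverse. The model and variable names are illustrative assumptions, not Professor Strang's examples.

```python
def forward_backward(x, y, w1, w2):
    """Loss L = (w2 * relu(w1 * x) - y)^2; returns L and dL/dw1, dL/dw2."""
    # Forward pass: compute and record intermediate values.
    a = w1 * x           # pre-activation
    h = max(a, 0.0)      # ReLU nonlinearity
    p = w2 * h           # prediction
    e = p - y            # error
    L = e * e            # squared-error loss

    # Backward pass: chain rule, propagating from the loss toward the weights.
    dL_dp = 2.0 * e                            # dL/dp
    dL_dw2 = dL_dp * h                         # dp/dw2 = h
    dL_dh = dL_dp * w2                         # dp/dh  = w2
    dL_da = dL_dh * (1.0 if a > 0 else 0.0)    # ReLU derivative
    dL_dw1 = dL_da * x                         # da/dw1 = x
    return L, dL_dw1, dL_dw2

L, g1, g2 = forward_backward(x=1.5, y=2.0, w1=0.8, w2=1.2)
```

Each backward step reuses a value saved during the forward pass, which is why backpropagation computes all partial derivatives in roughly one extra pass rather than one pass per weight. A finite-difference check confirms the gradients.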
