In the previous lecture we saw that single variable optimization mostly comes down to finding critical points, that is, finding where the derivative of a function is equal to zero. In practice there is often no closed-form expression for these roots, making them difficult to obtain explicitly. In this lecture we review Newton's method for finding roots of functions and show how it can help us solve optimization problems and perform sensitivity analysis.
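As a rough sketch of the idea, Newton's method iterates x ← x − g(x)/g'(x) to find a root of g; applying it to the derivative f' locates a critical point of f. The function and starting guess below are illustrative choices, not examples from the lecture:

```python
def newton(g, gprime, x0, tol=1e-10, max_iter=50):
    """Newton's method: repeat x <- x - g(x)/g'(x) until |g(x)| < tol."""
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        if abs(gx) < tol:
            break
        x = x - gx / gprime(x)
    return x

# Illustrative optimization problem: minimize f(x) = x^4 - 3x^3 + 2
# by finding a root of f'(x) = 4x^3 - 9x^2 (Newton needs f'' as well).
fprime = lambda x: 4 * x**3 - 9 * x**2
fsecond = lambda x: 12 * x**2 - 18 * x

x_star = newton(fprime, fsecond, x0=3.0)
print(x_star)  # converges to the critical point x = 9/4
```

Note that Newton's method only finds a critical point; checking the sign of the second derivative there (f''(9/4) > 0 in this sketch) is what confirms it is a local minimum.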
For a full analytical introduction to Newton's method, see: https://www.youtube.com/watch?v=qvvQ_n-vDys
This course is taught by Jason Bramburger for Concordia University.
More information on the instructor: https://hybrid.concordia.ca/jbrambur/
Follow @jbramburger7 on Twitter for updates.