We do four examples of classifying critical points for scalar-valued functions of multiple variables. In each, we compute the gradient to find the critical points, and then analyze the Hessian matrix at those critical points.
We know that if our Hessian corresponds to a positive definite quadratic form, then our CP is a local min; to a negative definite quadratic form, a local max; and to an indefinite quadratic form, a saddle. We use the eigenvalues (or rather, just their signs) of the Hessian to see which of these cases applies.
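If you want to experiment along with the video, here's a minimal sketch of the eigenvalue-sign test in sympy. The function f(x, y) = x² + xy + y² is my own example, not one from the video:

```python
# Classify critical points of f(x, y) by the signs of the Hessian's
# eigenvalues. The specific f below is a stand-in example.
import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + x*y + y**2

# Critical points: solve gradient = 0.
grad = [sp.diff(f, v) for v in (x, y)]
crit_pts = sp.solve(grad, (x, y), dict=True)

H = sp.hessian(f, (x, y))
for cp in crit_pts:
    eigs = list(H.subs(cp).eigenvals())
    if all(e > 0 for e in eigs):
        label = "local min"        # positive definite
    elif all(e < 0 for e in eigs):
        label = "local max"        # negative definite
    elif any(e > 0 for e in eigs) and any(e < 0 for e in eigs):
        label = "saddle"           # indefinite
    else:
        label = "inconclusive"     # semidefinite: some eigenvalue is 0
    print(cp, label)
```

Here the only critical point is the origin, with Hessian [[2, 1], [1, 2]] (eigenvalues 3 and 1), so it prints a local min.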
For a little more challenge, we look at two examples where the Hessian is semidefinite (so our test is inconclusive), and I give you some tips for how to tackle this situation.
We wrap up with a function from R^3 to R and use Sylvester's criterion to classify the Hessian without computing eigenvalues. I take a moment to explain how the criterion works and how to remember the sign pattern for definiteness. Then we apply it to the final example and see that the Hessian is indefinite, so we have a saddle.
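Sylvester's criterion checks definiteness from the leading principal minors (the determinants of the upper-left k×k blocks): all positive means positive definite, while a strict alternating pattern (−, +, −, …) means negative definite. A quick sketch, with a matrix I made up for illustration:

```python
# Sylvester's criterion: test definiteness of a symmetric matrix from the
# signs of its leading principal minors, without computing eigenvalues.
import numpy as np

def leading_minors(A):
    """Determinants of the upper-left k x k blocks, k = 1..n."""
    return [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]

def classify(A):
    m = leading_minors(A)
    if all(d > 0 for d in m):
        return "positive definite"   # all minors > 0
    if all((-1)**k * d > 0 for k, d in enumerate(m, start=1)):
        return "negative definite"   # minors alternate -, +, -, ...
    return "not definite"            # indefinite or semidefinite

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 2.0]])
print(classify(A))   # minors 2, 3, 4 are all positive -> "positive definite"
```

Note the criterion as stated only distinguishes the strictly definite cases; if some minor is zero or the sign pattern is mixed, you fall back to other tools.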
The main takeaway here is that classifying critical points isn't always about computing eigenvalues—it can involve some creative thinking. I also show you ways to avoid explicitly computing the precise eigenvalues: the trace-determinant method for 2x2 matrices, as well as Sylvester's criterion (leading principal minors) for general nxn matrices.
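The 2x2 trace-determinant shortcut works because the determinant is the product of the eigenvalues and the trace is their sum, so their signs alone settle the classification. A tiny sketch (the function name is mine, not from the video):

```python
def classify_2x2(a, b, c):
    """Classify a critical point whose Hessian is [[a, b], [b, c]]."""
    det = a * c - b * b   # product of the two eigenvalues
    tr = a + c            # sum of the two eigenvalues
    if det < 0:
        return "saddle"       # eigenvalues have opposite signs
    if det > 0:
        # both eigenvalues share the sign of the trace
        return "local min" if tr > 0 else "local max"
    return "inconclusive"     # det == 0: semidefinite, test is silent

print(classify_2x2(2, 1, 2))   # det = 3 > 0, tr = 4 > 0 -> "local min"
```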
#mathematics #math #optimization #hessianmatrix #eigenvalues #quadraticforms #linearalgebra #realanalysis #calculus3 #multivariablecalculus #secondderivativetest