Lagrange method: sufficient conditions
From what we know about unconstrained optimization, we expect that the matrix of second derivatives should somehow play a role. To get there, we need to differentiate twice the objective function with the constraint incorporated.
Summary on necessary condition
The problem is to maximize the objective subject to the constraint:
$$f(x,y)\to\max\quad\text{subject to}\quad g(x,y)=0.\qquad(1)$$
Everywhere we impose the implicit function existence condition: $g_y(x,y)\neq 0$. Near the point of interest it allows us to solve the constraint for $y$, obtaining a function $y=y(x)$ with
$$y'(x)=-\frac{g_x(x,y(x))}{g_y(x,y(x))}.\qquad(2)$$
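The implicit derivative formula $y'(x)=-g_x/g_y$ can be sanity-checked with SymPy. Below is a minimal sketch on a concrete constraint of my choosing (not from the text), $g=x^2+y^2-2$, which near $(1,1)$ solves to $y(x)=\sqrt{2-x^2}$:

```python
import sympy as sp

x, y = sp.symbols('x y')
# Concrete constraint chosen for illustration: g = x**2 + y**2 - 2.
# Near (1, 1) we have g_y = 2*y != 0, and the constraint solves to y(x) below.
g = x**2 + y**2 - 2
y_of_x = sp.sqrt(2 - x**2)

# Right side of (2): -g_x/g_y, evaluated along y = y(x)
rhs = (-sp.diff(g, x) / sp.diff(g, y)).subs(y, y_of_x)
lhs = sp.diff(y_of_x, x)          # direct derivative of y(x)
print(sp.simplify(lhs - rhs))     # 0, confirming (2) for this constraint
```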
Let $(x_0,y_0)$ be an extremum point for (1). Then, as we proved, there exists $\lambda$ such that the Lagrangian $L(x,y)=f(x,y)+\lambda g(x,y)$ satisfies FOC's at $(x_0,y_0)$:
$$L_x=f_x+\lambda g_x=0,\qquad(3)$$
$$L_y=f_y+\lambda g_y=0.\qquad(4)$$
Also we need the function with the constraint built into it: $F(x)=f(x,y(x))$.
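The FOC's and the function $F$ can be illustrated on a small example of my choosing (the functions $f=xy$, $g=x+y-2$ and the resulting numbers are assumptions for illustration, not from the text):

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')
# Toy example chosen for illustration: f = x*y, g = x + y - 2
f = x * y
g = x + y - 2
L = f + lam * g  # Lagrangian

# FOC's (3)-(4) together with the constraint g = 0
sol = sp.solve([sp.diff(L, x), sp.diff(L, y), g], [x, y, lam], dict=True)
print(sol)  # the unique critical point: x = 1, y = 1, lam = -1

# The function with the constraint built in: F(x) = f(x, y(x)), here y(x) = 2 - x
F = f.subs(y, 2 - x)
print(sp.diff(F, x).subs(x, 1))  # 0: the FOC for F holds at x0 = 1
```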
Heading to sufficient conditions
We need to check the sign of the second derivative of $F$. Its first derivative is
$$F'(x)=f_x(x,y(x))+f_y(x,y(x))\,y'(x),$$
and the FOC $F'(x_0)=0$ holds at the extremum.
Differentiating $F'(x)=f_x+f_y\,y'$ once again gives
$$F''(x)=f_{xx}+2f_{xy}y'+f_{yy}(y')^2+f_y y''.\qquad(5)$$
Similarly, differentiating the constraint identity $g(x,y(x))=0$ twice gives
$$g_{xx}+2g_{xy}y'+g_{yy}(y')^2+g_y y''=0.\qquad(6)$$
(All derivatives are evaluated along $(x,y(x))$.)
Since we need to obtain the Lagrangian, let us multiply (6) by $\lambda$ and add the result to (5):
$$F''=(f_{xx}+\lambda g_{xx})+2(f_{xy}+\lambda g_{xy})y'+(f_{yy}+\lambda g_{yy})(y')^2+(f_y+\lambda g_y)y''.\qquad(7)$$
Here $(f_y+\lambda g_y)y''=0$ because of (4).
Then (7) rewrites as
$$F''=L_{xx}+2L_{xy}y'+L_{yy}(y')^2=\begin{pmatrix}1 & y'\end{pmatrix}\begin{pmatrix}L_{xx} & L_{xy}\\ L_{xy} & L_{yy}\end{pmatrix}\begin{pmatrix}1\\ y'\end{pmatrix}.\qquad(8)$$
This is a quadratic form $\tau^T H\tau$ of the Hessian
$$H=\begin{pmatrix}L_{xx} & L_{xy}\\ L_{xy} & L_{yy}\end{pmatrix}$$
of $L$ with respect to $(x,y)$ (no differentiation with respect to $\lambda$), applied to the vector $\tau=(1,y'(x_0))^T$.
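Identity (8) can be verified numerically. Here is a sketch on a toy example of my choosing ($f=xy$, $g=x+y-2$; an assumption for illustration, not from the text), checking that the quadratic form of the Hessian equals the direct second derivative $F''(x_0)$:

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')
# Toy example chosen for illustration: f = x*y, g = x + y - 2
f, g = x * y, x + y - 2
L = f + lam * g
x0, lam0 = 1, -1                          # critical point found from the FOC's

yprime = -sp.diff(g, x) / sp.diff(g, y)   # formula (2); here y' = -1

# Hessian of L with respect to (x, y) only (no differentiation in lambda)
H = sp.Matrix([[sp.diff(L, x, x), sp.diff(L, x, y)],
               [sp.diff(L, x, y), sp.diff(L, y, y)]])
tau = sp.Matrix([1, yprime])
quad = (tau.T * H * tau)[0].subs(lam, lam0)   # quadratic form in (8)

# Direct second derivative of F(x) = f(x, y(x)) with y(x) = 2 - x
F = f.subs(y, 2 - x)
Fpp = sp.diff(F, x, 2).subs(x, x0)

print(quad, Fpp)  # both equal -2, as (8) predicts
```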
Rough sufficient condition. If we require the Hessian $H$ to be positive definite, then $t^THt>0$ for any $t\neq 0$ and, in particular, for $t=\tau=(1,y'(x_0))$. Thus, positive definiteness of $H$, together with the FOC's (3) and (4), will be sufficient for $F$ to have a minimum at $x_0$.
Refined sufficient condition. We can relax the condition by reducing the set of vectors $t$ on which $t^THt$ should be positive. Note from (2) that $\tau=(1,y'(x_0))$ belongs to the set
$$T=\{t=(u,v): g_x(x_0,y_0)u+g_y(x_0,y_0)v=0\},$$
the tangent line to the constraint. Using (2), for $t=(u,v)\in T$ we can write
$$v=-\frac{g_x(x_0,y_0)}{g_y(x_0,y_0)}\,u=y'(x_0)\,u,$$
so that $t=u\tau$. This means that $T$ is a straight line spanned by $\tau$. Requiring $t^THt>0$ for any nonzero $t\in T$, we have positivity of (8) for $t=\tau$. We summarize our findings as follows:
Theorem. Assume the implicit function existence condition and consider a critical point of the Lagrangian (a point that satisfies the FOC's (3) and (4)). a) If at that point $t^THt>0$ for any nonzero $t\in T$, then that point is a minimum point of problem (1). b) If at that point $t^THt<0$ for any nonzero $t\in T$, then that point is a maximum point of problem (1).
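A sketch illustrating part b) of the theorem, on a toy problem of my choosing (maximize $xy$ subject to $x+y=2$; an assumption for illustration): the Hessian of the Lagrangian is indefinite, so the rough condition is silent, but the refined condition restricted to $T$ detects the maximum.

```python
import numpy as np

# Toy illustration (chosen example, not from the text):
# maximize f = x*y subject to g = x + y - 2 = 0.
# Critical point: (x0, y0) = (1, 1) with lambda = -1.
# Hessian of the Lagrangian L = x*y + lam*(x + y - 2) in (x, y):
H = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Rough condition fails: H is indefinite (one negative, one positive eigenvalue)
print(np.linalg.eigvalsh(H))   # [-1.  1.]

# Refined condition: on T = {(u, v): u + v = 0} the quadratic form is negative
t = np.array([1.0, -1.0])      # spans the tangent line T
print(t @ H @ t)               # -2.0 < 0, so (1, 1) is a maximum point
```

Indeed, on the line $x+y=2$ the product $xy$ is maximized at $(1,1)$, which matches part b).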