Lagrange method: sufficient conditions
From what we know about unconstrained optimization, we expect that somehow the matrix of second derivatives should play a role. To get there, we need to twice differentiate the objective function with the constraint built into it.
Summary of the necessary condition
(1) The problem is to maximize $f(x,y)$ subject to $g(x,y)=0$.

Everywhere we impose the implicit function existence condition:

(2) $g_y(x,y)\ne 0$.

By the implicit function theorem, (2) allows us to solve the constraint for $y$ as a function $y=y(x)$. Differentiation of the restriction $g(x,y(x))=0$ gives

(3) $g_x+g_y y'=0$.
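As a quick check of (3) (my illustration, not part of the original), here is a minimal SymPy sketch with a hypothetical constraint $g(x,y)=x+y^2-1$, which satisfies (2) wherever $y\ne 0$:

```python
import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

# Hypothetical constraint g(x, y) = x + y^2 - 1, with g_y = 2y != 0 for y != 0
g = x + y(x)**2 - 1

# Differentiate the restriction g(x, y(x)) = 0 in x and solve for y'(x), as in (3)
yprime = sp.solve(sp.Eq(sp.diff(g, x), 0), sp.Derivative(y(x), x))[0]
print(yprime)  # -1/(2*y(x)), which is exactly -g_x/g_y
```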
Let $(x_0,y_0)$ be an extremum point for (1). Then, as we proved, there exists $\lambda$ such that the Lagrangian

$L(x,y)=f(x,y)+\lambda g(x,y)$

satisfies the FOC's:

(4) $L_x=f_x+\lambda g_x=0$, $L_y=f_y+\lambda g_y=0$.

Also we need the function with the constraint built into it: $F(x)=f(x,y(x))$.
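To make the FOC's concrete, here is a short SymPy sketch for a hypothetical problem of my choosing: extremize $f(x,y)=x^2+y^2$ subject to $g(x,y)=x+y-1=0$.

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')

# Hypothetical example: f = x^2 + y^2 subject to g = x + y - 1 = 0
f = x**2 + y**2
g = x + y - 1

# Lagrangian and FOC's (4); differentiating in lam recovers the constraint itself
L = f + lam * g
foc = [sp.diff(L, v) for v in (x, y, lam)]
print(sp.solve(foc, (x, y, lam), dict=True))  # [{lam: -1, x: 1/2, y: 1/2}]
```

The FOC's alone do not tell whether $(1/2,1/2)$ is a minimum or a maximum; that is what the second-order analysis below is for.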
Heading to sufficient conditions
We need to check the sign of the second derivative of $F(x)=f(x,y(x))$. By the chain rule, $F'=f_x+f_y y'$, and differentiating once more,

(5) $F''=f_{xx}+2f_{xy}y'+f_{yy}(y')^2+f_y y''$.
Differentiating (3) once again gives

(6) $g_{xx}+2g_{xy}y'+g_{yy}(y')^2+g_y y''=0$.
Since we need to obtain the Lagrangian, let us multiply (6) by $\lambda$ and add the result to (5):

(7) $F''=(f_{xx}+\lambda g_{xx})+2(f_{xy}+\lambda g_{xy})y'+(f_{yy}+\lambda g_{yy})(y')^2+(f_y+\lambda g_y)y''=L_{xx}+2L_{xy}y'+L_{yy}(y')^2$.

Here $(f_y+\lambda g_y)y''=0$ because of (4).
Denote

$u=\begin{pmatrix}1\\y'\end{pmatrix},\qquad H=\begin{pmatrix}L_{xx}&L_{xy}\\L_{xy}&L_{yy}\end{pmatrix}.$

Then (7) rewrites as

(8) $F''=u^THu$.

This is a quadratic form of the Hessian $H$ of the Lagrangian with respect to $(x,y)$ (no differentiation with respect to $\lambda$).
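A sanity check of (8) on the same hypothetical example ($f=x^2+y^2$, $g=x+y-1$): solving the constraint as $y(x)=1-x$, the direct second derivative of $F(x)=f(x,y(x))$ should match $u^THu$.

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')
f = x**2 + y**2
g = x + y - 1              # hypothetical example; y(x) = 1 - x solves g = 0

# Direct route: F(x) = f(x, y(x)) and its second derivative
F = f.subs(y, 1 - x)
print(sp.diff(F, x, 2))    # 4

# Route via (8): u = (1, y')^T with y' = -g_x/g_y = -1, H = Hessian of L in (x, y)
L = f + lam * g
H = sp.hessian(L, (x, y))
u = sp.Matrix([1, -1])
print((u.T * H * u)[0, 0])  # 4, matching F'' as (8) predicts
```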
Rough sufficient condition. If we require the Hessian $H$ to be positive definite, then $u^THu>0$ for any nonzero $u$ and, in particular, for $u=(1,y')^T$. Thus, positive definiteness of $H$, together with the FOC's (4), will be sufficient for $F$ to have a minimum.
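In the hypothetical example above, the rough condition is easy to verify:

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')

# Lagrangian of the hypothetical example f = x^2 + y^2, g = x + y - 1
L = x**2 + y**2 + lam * (x + y - 1)

# Hessian in (x, y) only; here it equals diag(2, 2)
H = sp.hessian(L, (x, y))
print(H.is_positive_definite)  # True: together with (4), (1/2, 1/2) is a minimum
```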
Refined sufficient condition. We can relax the condition by reducing the set of $u$ on which $u^THu$ should be positive. Note from (3) that $u=(1,y')^T$ belongs to the set

$S=\{u\in\mathbb{R}^2:g_xu_1+g_yu_2=0\}.$

Using (2), for $u\in S$ we can write

$u_2=-\frac{g_x}{g_y}u_1.$

This means that $S$ is a straight line through the origin. Requiring $u^THu>0$ for any nonzero $u\in S$, we have positivity of (8) for $u=(1,y')^T$, since this vector is a nonzero element of $S$.
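Since $S$ is a line, checking the sign of $u^THu$ on $S$ reduces to a one-variable problem: substitute $u_2=-(g_x/g_y)u_1$ and look at the coefficient of $u_1^2$. A sketch with generic placeholder symbols (mine, not the original's notation):

```python
import sympy as sp

u1, gx, gy, Lxx, Lxy, Lyy = sp.symbols('u1 g_x g_y L_xx L_xy L_yy')

# On the line S we have u_2 = -(g_x/g_y) u_1; H is a generic symmetric Hessian
u = sp.Matrix([u1, -(gx / gy) * u1])
H = sp.Matrix([[Lxx, Lxy], [Lxy, Lyy]])

# The restricted quadratic form collapses to (coefficient) * u1^2
q = sp.expand((u.T * H * u)[0, 0])
print(sp.cancel(q / u1**2))  # (L_xx*g_y**2 - 2*L_xy*g_x*g_y + L_yy*g_x**2)/g_y**2
```

The sign of this coefficient (equivalently, of its numerator, since $g_y^2>0$) is exactly what parts a) and b) of the theorem below test.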
We summarize our findings as follows:
Theorem. Assume the implicit function existence condition (2) and consider a critical point of the Lagrangian (a point that satisfies the FOC's (4)). a) If at that point $u^THu>0$ for any nonzero $u\in S$, then that point is a minimum point of problem (1). b) If at that point $u^THu<0$ for any nonzero $u\in S$, then that point is a maximum point of problem (1).
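A worked illustration of part b) (my example, not from the original text): maximize $f(x,y)=xy$ subject to $x+y-1=0$. Here the Hessian of the Lagrangian is indefinite, so the rough condition is silent, but the refined condition detects the maximum at $(1/2,1/2)$.

```python
import sympy as sp

x, y, lam, u1 = sp.symbols('x y lam u1')

# Hypothetical example for part b): f = x*y subject to g = x + y - 1 = 0
f, g = x * y, x + y - 1
L = f + lam * g

# FOC's (4) give the critical point x = y = 1/2, lam = -1/2
sol = sp.solve([sp.diff(L, v) for v in (x, y, lam)], (x, y, lam), dict=True)[0]

H = sp.hessian(L, (x, y)).subs(sol)  # [[0, 1], [1, 0]] -- indefinite
print(H.is_positive_definite, H.is_negative_definite)  # False False: inconclusive

# On S: g_x = g_y = 1, so u_2 = -u_1; the restricted form is negative for u1 != 0
u = sp.Matrix([u1, -u1])
print(sp.expand((u.T * H * u)[0, 0]))  # -2*u1**2 < 0, so (1/2, 1/2) is a maximum
```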