Cauchy-Schwarz inequality and optimization 1
Between Cauchy and Schwarz there was also Bunyakovsky, who contributed to this inequality.
Inequalities, if they are sharp, can be used for optimization, even in the infinite-dimensional case. The method explained here does not rely on the Lagrange method, the Kuhn-Tucker theorem, or convexity. It illustrates the point that it is not a good idea to try to fit all problems into one framework. The solution method should depend on the problem.
Problem. Let $a_1,\dots,a_n$ be fixed numbers and let $x_1,\dots,x_n$ be variables satisfying the restriction

(1) $\sum_{i=1}^n x_i^2=1.$

The problem is to maximize the weighted sum $\sum_{i=1}^n a_ix_i$ subject to the above restriction. This can be done using the convexity method. However, our future application will contain an infinite number of variables, in which case the convexity method becomes conceptually complex. Therefore here we provide a different method, based on the Cauchy-Schwarz inequality. In our case it looks like this:

(2) $\left|\sum_{i=1}^n a_ix_i\right|\le\left(\sum_{i=1}^n a_i^2\right)^{1/2}\left(\sum_{i=1}^n x_i^2\right)^{1/2}.$
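As a side illustration (not part of the argument), here is a minimal numerical sketch that checks inequality (2) on randomly generated vectors; the names a and x and the sample sizes are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Check the Cauchy-Schwarz inequality (2) on random vectors:
# |sum a_i x_i| <= (sum a_i^2)^(1/2) * (sum x_i^2)^(1/2)
for _ in range(1000):
    n = int(rng.integers(1, 20))
    a = rng.normal(size=n)
    x = rng.normal(size=n)
    lhs = abs(a @ x)
    rhs = np.sqrt(a @ a) * np.sqrt(x @ x)
    assert lhs <= rhs + 1e-12  # small tolerance for floating-point error
```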
We solve the problem in two steps. First we bound $\sum_{i=1}^n a_ix_i$ from above, and then we find the point $x_1,\dots,x_n$ at which the upper bound is attained.

Step 1. (1) and (2) obviously imply

(3) $\sum_{i=1}^n a_ix_i\le\left(\sum_{i=1}^n a_i^2\right)^{1/2}.$
Step 2. We need to find $x_1,\dots,x_n$ such that (1) is satisfied and (3) turns into an equality. This involves a little guessing. The guiding idea is that on the right in (3) we have squares of $a_1,\dots,a_n$. For them to appear on the left, we choose $x_i=ca_i$ with some constant $c$. Then we want

$\sum_{i=1}^n x_i^2=c^2\sum_{i=1}^n a_i^2=1$

to be true. This gives the value of the constant $c=\left(\sum_{i=1}^n a_i^2\right)^{-1/2}$. The point we are looking for becomes

(4) $x_i=\frac{a_i}{\left(\sum_{j=1}^n a_j^2\right)^{1/2}},\quad i=1,\dots,n.$

It satisfies (1): $\sum_{i=1}^n x_i^2=\frac{\sum_{i=1}^n a_i^2}{\sum_{j=1}^n a_j^2}=1.$ Further, the maximized value of $\sum_{i=1}^n a_ix_i$ is $\frac{\sum_{i=1}^n a_i^2}{\left(\sum_{j=1}^n a_j^2\right)^{1/2}}=\left(\sum_{i=1}^n a_i^2\right)^{1/2}$.
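As a sanity check (a sketch, not part of the derivation), the following snippet verifies numerically that the point (4) satisfies the restriction (1), attains the bound (3), and is not beaten by random feasible points; the specific vector a is an arbitrary example.

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=5)          # fixed numbers a_1,...,a_n (arbitrary example)

bound = np.sqrt(a @ a)          # the upper bound (3): (sum a_i^2)^(1/2)
x_star = a / np.sqrt(a @ a)     # the maximizer (4): x_i = a_i / (sum a_j^2)^(1/2)

print(np.isclose(x_star @ x_star, 1.0))   # restriction (1) holds
print(np.isclose(a @ x_star, bound))      # the bound (3) is attained

# Random points satisfying (1) never exceed the bound:
for _ in range(1000):
    x = rng.normal(size=a.size)
    x /= np.sqrt(x @ x)          # normalize so that (1) holds
    assert a @ x <= bound + 1e-12
```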