Calculus Optimization Methods/Lagrange Multipliers

From Wikibooks, open books for an open world

The method of Lagrange multipliers solves the constrained optimization problem by transforming it into a non-constrained optimization problem of the form:

  • \mathcal{L}(x_1,x_2,\ldots, x_n,\lambda)= f(x_1,x_2,\ldots, x_n)+\lambda\,(k-g(x_1,x_2,\ldots, x_n))

Finding the gradient and Hessian of this function, as was done above, will then determine any optimum values of \mathcal{L}(x_1,x_2,\ldots, x_n,\lambda).
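The recipe can be sketched symbolically, here for the worked example that follows; this is a minimal sketch assuming SymPy is available (the variable names are illustrative, not part of the method):

```python
# Build the Lagrangian L = f + lambda*(k - g), take its gradient,
# and solve grad L = 0 for the stationary points.
import sympy as sp

x, y, lam = sp.symbols('x y lam')
f = 2*x**2 + y**2          # objective f(x, y)
g = x + y                  # constraint function g(x, y), with k = 1
L = f + lam*(1 - g)        # the Lagrangian

grad = [sp.diff(L, v) for v in (x, y, lam)]
sol = sp.solve(grad, (x, y, lam), dict=True)[0]
print(sol)  # {lam: 4/3, x: 1/3, y: 2/3}
```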

Suppose we now want to find optimum values for f(x,y)=2x^2+y^2 subject to the constraint x+y=1, an example from [2].

The Lagrangian method then yields the unconstrained function

  • \mathcal{L}(x,y,\lambda)= 2x^2+y^2+\lambda (1-x-y)

The gradient for this new function is

  • \frac{\partial \mathcal{L}}{\partial x}(x,y,\lambda)= 4x-\lambda=0
  • \frac{\partial \mathcal{L}}{\partial y}(x,y,\lambda)= 2y-\lambda=0
  • \frac{\partial \mathcal{L}}{\partial \lambda}(x,y,\lambda)= 1-x-y=0

The stationary points of the above equations can be found from their matrix form.

  • \begin{bmatrix} 4 & 0 & -1 \\ 0 & 2 & -1 \\ -1 & -1 & 0 \end{bmatrix} \begin{bmatrix} x \\ y \\ \lambda \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ -1 \end{bmatrix}

This results in x=1/3, y=2/3, \lambda=4/3.
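The linear system above can also be solved numerically; a minimal sketch, assuming NumPy is available:

```python
# Solve the stationarity system A [x, y, lambda]^T = b from the text.
import numpy as np

A = np.array([[ 4.0, 0.0, -1.0],
              [ 0.0, 2.0, -1.0],
              [-1.0, -1.0, 0.0]])
b = np.array([0.0, 0.0, -1.0])

x, y, lam = np.linalg.solve(A, b)
print(x, y, lam)  # approximately 1/3, 2/3, 4/3
```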

Next we can use the bordered Hessian, as before, to determine the type of this stationary point.

  • H(\mathcal{L})= \begin{bmatrix} 4 & 0 & -1 \\ 0 & 2 & -1 \\ -1 & -1 & 0 \end{bmatrix}

Its determinant is \det H(\mathcal{L})=-6. With one constraint (m=1), the stationary point is a minimum when (-1)^m \det H(\mathcal{L})>0, and here (-1)(-6)=6>0. Therefore the solution (1/3,2/3,4/3) minimizes f(x,y)=2x^2+y^2 subject to x+y=1, with f(1/3,2/3)=2/3.
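The determinant test and the conclusion can be checked numerically; a minimal sketch, assuming NumPy is available (the variable ordering (x, y, lambda) matches the text):

```python
# Bordered Hessian of the Lagrangian at the stationary point.
import numpy as np

H = np.array([[ 4.0, 0.0, -1.0],
              [ 0.0, 2.0, -1.0],
              [-1.0, -1.0, 0.0]])
det = np.linalg.det(H)
print(round(det))  # -6, so (-1)^1 * det = 6 > 0: a minimum

# Sanity check: restrict f to the constraint x + y = 1 via y = 1 - x.
f = lambda x: 2*x**2 + (1 - x)**2
print(f(1/3))                          # 2/3 at the stationary point
print(f(1/3) < f(0.2) and f(1/3) < f(0.5))  # True: nearby points are larger
```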