
Calculus Optimization Methods



A key application of calculus is in optimization: finding the maximum and minimum values of a function, and the points at which these extrema are attained.

Context


Formally, the field of mathematical optimization is called mathematical programming, and calculus methods of optimization are basic forms of nonlinear programming. We will primarily discuss finite-dimensional optimization, illustrating with functions of one or two variables and discussing $n$ variables algebraically. We will also indicate some extensions to infinite-dimensional optimization, such as the calculus of variations, which is a primary application of these methods in physics.

Techniques


Basic techniques include the first and second derivative tests and their higher-dimensional generalizations.
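
As a minimal sketch of these two tests in one variable (the function $x^3 - 3x$ is an illustrative choice, not taken from this book), using SymPy:

    import sympy as sp

    x = sp.symbols('x')
    f = x**3 - 3*x                       # illustrative function

    # First derivative test: stationary points are the roots of f'(x) = 0.
    stationary = sp.solve(sp.diff(f, x), x)          # -> [-1, 1]

    # Second derivative test: the sign of f''(x) classifies each point.
    for p in stationary:
        curvature = sp.diff(f, x, 2).subs(x, p)
        if curvature > 0:
            print(p, 'local minimum')                # 1 is a local minimum
        elif curvature < 0:
            print(p, 'local maximum')                # -1 is a local maximum
        else:
            print(p, 'test inconclusive')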

A more advanced technique is Lagrange multipliers, together with generalizations such as the Karush–Kuhn–Tucker conditions and Lagrange multipliers on Banach spaces.
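
The following sketch applies the Lagrange multiplier condition $\nabla f = \lambda \nabla g$ to a small problem (the objective $xy$ and constraint $x + y = 4$ are assumptions for illustration, not from the text):

    import sympy as sp

    x, y, lam = sp.symbols('x y lambda')
    f = x * y                            # illustrative objective
    g = x + y                            # constraint function, with g = c
    c = 4

    # Stationarity of the Lagrangian: grad f = lam * grad g, plus g = c.
    eqs = [sp.diff(f, x) - lam * sp.diff(g, x),
           sp.diff(f, y) - lam * sp.diff(g, y),
           g - c]
    print(sp.solve(eqs, [x, y, lam], dict=True))
    # -> [{x: 2, y: 2, lambda: 2}]: f is maximized at (2, 2) on the line x + y = 4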

Applications


Optimization, particularly via Lagrange multipliers, is used in a wide range of fields.

Further, several areas of mathematics can be understood as generalizations of these methods, notably Morse theory and calculus of variations.

Terminology

  • Input points, output values
  • Maxima, minima, extrema, optima
  • Stationary point, critical point; stationary value, critical value
  • Objective function
  • Constraints – equality and inequality
    • Especially sublevel sets
    • Feasible region, whose points are candidate solutions
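
To make the last items concrete (a standard reading of these terms, not spelled out in the original list): an inequality constraint typically takes the form $g(x, y) \le c$, and the set of points satisfying it is the sublevel set $\{(x, y) : g(x, y) \le c\}$. The intersection of all such sets, together with any equality constraints, is the feasible region, whose points are the candidate solutions.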

Statement


This tutorial presents an introduction to optimization problems that involve finding a maximum or a minimum value of an objective function $f(x, y)$ subject to a constraint of the form $g(x, y) = c$.
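
A standard way to encode such a problem (a sketch of the usual formulation; sign conventions for the multiplier vary between texts) is through the Lagrangian

$$\Lambda(x, y, \lambda) = f(x, y) - \lambda \, \bigl( g(x, y) - c \bigr),$$

whose stationary points, where $\nabla_{x, y, \lambda} \Lambda = 0$, recover both the multiplier condition $\nabla f = \lambda \nabla g$ and the constraint $g(x, y) = c$.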

Maximum and minimum


Finding the optimum values of a function $f(x, y)$ without a constraint is a well-known problem dealt with in calculus courses. One would normally use the gradient to find stationary points, then check all stationary and boundary points to find the optimum values.
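
A sketch of this procedure in SymPy, using the function $f(x, y) = x^2 + y^2$ from the example below:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**2 + y**2                      # the example function discussed below

    # Stationary points: solve grad f = 0.
    grad = [sp.diff(f, v) for v in (x, y)]
    print(sp.solve(grad, [x, y], dict=True))   # -> [{x: 0, y: 0}]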

Example


$f(x, y) = x^2 + y^2$ has one stationary point, at $(0, 0)$, where the gradient $\nabla f = (2x, \, 2y)$ vanishes.

The Hessian


A common method of determining whether or not a function has an extreme value at a stationary point is to evaluate the Hessian of the function at that point, where the Hessian is defined as

$$H(f) = \begin{pmatrix} \dfrac{\partial^2 f}{\partial x^2} & \dfrac{\partial^2 f}{\partial x \, \partial y} \\[1ex] \dfrac{\partial^2 f}{\partial y \, \partial x} & \dfrac{\partial^2 f}{\partial y^2} \end{pmatrix}.$$
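
A sketch of this computation for the example function; SymPy's `hessian` builds the matrix of second partial derivatives directly:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = x**2 + y**2

    H = sp.hessian(f, (x, y))            # matrix of second partials
    print(H)                             # Matrix([[2, 0], [0, 2]])
    print(H.subs({x: 0, y: 0}))          # constant here, so unchanged at (0, 0)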

Second derivative test


The second derivative test determines the optimality of a stationary point $x$ according to the following rules [2]:

  • If $H(f)$ is positive definite at the point $x$, then $f$ has a local minimum at $x$.
  • If $H(f)$ is negative definite at the point $x$, then $f$ has a local maximum at $x$.
  • If $H(f)$ has both negative and positive eigenvalues at $x$, then $x$ is a saddle point.
  • Otherwise the test is inconclusive.

In the above example, $H(f) = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}$, which is positive definite.

Therefore $f$ has a minimum at $(0, 0)$.
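
These rules translate directly into an eigenvalue check. A minimal numeric sketch for the example's Hessian at $(0, 0)$:

    import numpy as np

    # Hessian of f(x, y) = x**2 + y**2 at the stationary point (0, 0).
    H = np.array([[2.0, 0.0],
                  [0.0, 2.0]])

    eigs = np.linalg.eigvalsh(H)         # eigenvalues of the symmetric Hessian
    if np.all(eigs > 0):
        print('local minimum')           # fires here: eigenvalues are [2, 2]
    elif np.all(eigs < 0):
        print('local maximum')
    elif (eigs > 0).any() and (eigs < 0).any():
        print('saddle point')
    else:
        print('test inconclusive')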


References

[1] T. K. Moon and W. C. Stirling. Mathematical Methods and Algorithms for Signal Processing. Prentice Hall, 2000.
[2] http://www.ece.tamu.edu/~chmbrlnd/Courses/ECEN601/ECEN601-Chap3.pdf