R Programming/Optimization
- optimize() is devoted to one-dimensional optimization problems.
- optim(), nlm() and ucminf() (in the ucminf package) can be used for multidimensional optimization problems.
- nlminb() performs optimization with box constraints.
- The quadprog, minqa, rgenoud and trust packages provide further optimizers.
- Some work has been done to improve optimization in R. See the Updating and improving optim() slides from useR! 2009[1], the R-forge optimizer page[2] and the corresponding packages, including optimx.
Numerical Methods
One-dimensional problems
Consider the one-dimensional problem:
> func <- function(x){
+ return ( (x-2)^2 )
+ }
> func(-2)
[1] 16
>
> # plot the function using curve()
> curve(func,-4,8)
>
> # Here is another way to plot the function,
> # by evaluating it over a grid of points
> grid <- seq(-10, 10, by = .1)
> plot(grid, func(grid))
>
> # you can find the minimum using the optimize function
> optimize(f=func,interval=c(-10,10))
$minimum
[1] 2
$objective
[1] 0
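optimize() minimises by default, but it also accepts a maximum argument to search for a maximum instead. On the same interval the maximum of func sits at the boundary, and the returned list then has components $maximum and $objective:
> optimize(f = func, interval = c(-10, 10), maximum = TRUE)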
Newton-Raphson
- nlm() provides a Newton-type algorithm.
- The maxLik package for maximizing likelihood functions includes a Newton-Raphson method.
- newtonraphson() in the spuRs package.
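As a quick illustration, nlm() can minimise the two-dimensional quadratic used throughout this page; the gradient is computed numerically unless one is supplied:
> func <- function(x){
+   return((x[1] - 2)^2 + (x[2] - 1)^2)
+ }
> nlm(f = func, p = c(0, 0))
The result is a list whose $estimate component holds the minimiser (close to c(2, 1)) and whose $minimum component holds the attained value.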
BFGS
- The BFGS quasi-Newton method and its limited-memory, box-constrained variant L-BFGS-B are both available through optim():
> func <- function(x){
+   out <- (x[1]-2)^2 + (x[2]-1)^2
+   return(out)
+ }
> optim(par = c(0, 0), fn = func, gr = NULL,
+       method = "BFGS", hessian = TRUE)
> optim(par = c(0, 0), fn = func, gr = NULL,
+       method = "L-BFGS-B",
+       lower = -Inf, upper = Inf,
+       hessian = TRUE)
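BFGS can use an analytic gradient through the gr argument, which usually improves speed and accuracy; for this quadratic the gradient is easy to write down:
> grad <- function(x){
+   c(2 * (x[1] - 2), 2 * (x[2] - 1))
+ }
> optim(par = c(0, 0), fn = func, gr = grad, method = "BFGS")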
Conjugate gradient method
The conjugate gradient method is available through optim() with method = "CG".
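For example, reusing func() from above:
> optim(par = c(0, 0), fn = func, method = "CG")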
Trust Region Method
- The trust package implements a trust region method; a sketch follows.
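Below is a minimal sketch assuming the trust package's convention that the objective function returns a list holding the value, gradient and Hessian at a point; parinit is the starting point and rinit/rmax control the trust region radius:
> library(trust)
> objfun <- function(x){
+   list(value    = (x[1] - 2)^2 + (x[2] - 1)^2,
+        gradient = c(2 * (x[1] - 2), 2 * (x[2] - 1)),
+        hessian  = 2 * diag(2))
+ }
> trust(objfun, parinit = c(0, 0), rinit = 1, rmax = 5)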
The Nelder-Mead simplex method
Nelder-Mead is the default method of optim() and requires no derivatives:
> func <- function(x){
+   out <- (x[1]-2)^2 + (x[2]-1)^2
+   return(out)
+ }
>
> optim(par = c(0, 0), fn = func,
+       method = "Nelder-Mead", hessian = TRUE)
- The boot package also includes a simplex() function, but this implements the simplex algorithm for linear programming rather than Nelder-Mead.
Simulation methods
Simulated Annealing
- Simulated annealing is a stochastic algorithm that is useful for optimising non-smooth functions. It is implemented in optim() as method = "SANN".
> func <- function(x){
+   out <- (x[1]-2)^2 + (x[2]-1)^2
+   return(out)
+ }
> optim(par = c(0, 0), fn = func,
+       method = "SANN", hessian = TRUE)
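SANN's behaviour is driven almost entirely by its control parameters: maxit sets the total number of function evaluations, while temp and tmax tune the cooling schedule. The values below are purely illustrative:
> optim(par = c(0, 0), fn = func, method = "SANN",
+       control = list(maxit = 20000, temp = 20, tmax = 10))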
EM Algorithm
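The idea is easy to sketch in base R for a two-component normal mixture with known unit variances; the starting values and the number of iterations below are illustrative:
> set.seed(1)
> x <- c(rnorm(100, mean = 0), rnorm(100, mean = 4))
> p <- 0.5; mu1 <- -1; mu2 <- 1            # starting values
> for (i in 1:50) {
+   # E step: posterior probability that each point comes from component 1
+   d1 <- p * dnorm(x, mu1)
+   d2 <- (1 - p) * dnorm(x, mu2)
+   w  <- d1 / (d1 + d2)
+   # M step: update the mixing weight and the component means
+   p   <- mean(w)
+   mu1 <- sum(w * x) / sum(w)
+   mu2 <- sum((1 - w) * x) / sum(1 - w)
+ }
> c(p, mu1, mu2)   # should approach 0.5, 0 and 4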
Genetic Algorithm
- rgenoud package for genetic algorithms[3]
- gaoptim package for genetic algorithms[4]
- GA, a general-purpose package for optimization using genetic algorithms. It provides a flexible set of tools for implementing genetic-algorithm searches in both the continuous and the discrete case, constrained or not; see the sketch after this list.[5]
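A minimal sketch, assuming a recent version of the GA package in which box constraints are passed via lower and upper; note that ga() maximises its fitness function, so a minimisation problem is handled by negating the objective (the bounds are illustrative):
> library(GA)
> res <- ga(type = "real-valued",
+           fitness = function(x) -func(x),   # negate to minimise
+           lower = c(-10, -10), upper = c(10, 10))
> summary(res)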
References
Citations
[edit | edit source]- ↑ Updating and improving optim(), Use R 2009 slides http://www.agrocampus-ouest.fr/math/useR-2009/slides/Nash+Varadhan.pdf
- ↑ R-forge optimizer http://optimizer.r-forge.r-project.org/
- ↑ Jasjeet Sekhon homepage : http://sekhon.berkeley.edu/rgenoud/
- ↑ gaoptim on CRAN: http://cran.r-project.org/web/packages/gaoptim/index.html
- ↑ ga on CRAN: http://cran.r-project.org/web/packages/GA/index.html/
Sources
- Venables and Ripley, Chapter 16.
- Cameron and Trivedi, Microeconometrics, Chapter 10.
- Braun and Murdoch, A First Course in Statistical Programming with R (a very good reference on optimization using R), Chapter 7.