UNCMIN
Unconstrained Minimization
UNCMIN
is a FORTRAN77 library which
seeks a solution of the unconstrained minimization problem.
UNCMIN is a package which seeks to minimize a scalar function of
N variables. No side conditions or constraints are allowed, such
as requiring that the variables be positive. Only a local minimizer is
sought, so there may be other, better minimizers that are not found;
the result depends on the starting point and tolerances.
UNCMIN may also be used to seek the solution of N nonlinear equations, by
constructing the function to minimize as the sum of squares of the residuals.
Again, because only a local minimizer is sought, it is not guaranteed that the
answer returned will be a solution to the equations.
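For example, to apply the minimizer to the two nonlinear equations
x1^2 + x2^2 = 4 and exp(x1) + x2 = 1, a user-written function routine might
look like the following sketch. The calling sequence FCN(N,X,F), and the use
of single precision REAL (suggested by the SDOT and SNRM2 routines listed
below), are assumptions to be checked against the source code.

      subroutine fcn ( n, x, f )

c  Hypothetical user routine for UNCMIN: the value F is the sum of
c  squares of the residuals of the two nonlinear equations
c    x(1)**2 + x(2)**2 - 4 = 0
c    exp ( x(1) ) + x(2) - 1 = 0
c  so a root of the system is a point where F attains its minimum
c  value of zero.  The interface fcn ( n, x, f ) is an assumption.

      implicit none

      integer n
      real f
      real r1
      real r2
      real x(n)

      r1 = x(1)**2 + x(2)**2 - 4.0e0
      r2 = exp ( x(1) ) + x(2) - 1.0e0

      f = r1**2 + r2**2

      return
      end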
The program requires values of the gradient and the Hessian. These may be
computed in subroutines that the user supplies, or the program may be
requested to approximate these quantities using finite differences.
Given a scalar function F(X1,X2,...,XN), the gradient is the N-vector
(DF/DX1, DF/DX2, ..., DF/DXN).
The Hessian is the N by N matrix of second partial derivatives, whose (I,J)
entry is D2F/DXI DXJ. Note that this means the Hessian is symmetric.
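As a concrete illustration, for F(X1,X2) = X1**2 + 3*X1*X2 + 2*X2**2 the
gradient is (2*X1 + 3*X2, 3*X1 + 4*X2) and the Hessian is the constant
symmetric matrix with rows (2, 3) and (3, 4). If the user chooses to supply
these quantities, the routines might look like the sketch below; the calling
sequences are modeled on the dummy routines D1FCN and D2FCN listed further
down, but the exact argument lists are assumptions to be verified against
the source.

      subroutine d1fcn ( n, x, g )

c  Hypothetical analytic gradient of f(x) = x1**2 + 3 x1 x2 + 2 x2**2.
c  The interface d1fcn ( n, x, g ) is assumed to match the dummy
c  routine D1FCN in the package.

      implicit none

      integer n
      real g(n)
      real x(n)

      g(1) = 2.0e0 * x(1) + 3.0e0 * x(2)
      g(2) = 3.0e0 * x(1) + 4.0e0 * x(2)

      return
      end

      subroutine d2fcn ( nr, n, x, h )

c  Hypothetical analytic Hessian of the same function.  Here it is
c  constant, and symmetric, as noted above.  The interface
c  d2fcn ( nr, n, x, h ), with NR the leading dimension of H, is
c  an assumption.

      implicit none

      integer nr
      integer n
      real h(nr,n)
      real x(n)

      h(1,1) = 2.0e0
      h(1,2) = 3.0e0
      h(2,1) = 3.0e0
      h(2,2) = 4.0e0

      return
      end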
Three global strategies have been implemented in this package:
- line search, which is the default method,
- the dogleg trust region method,
- the hookstep trust region method of More-Hebdon.
The relative performance of these methods will vary from problem to problem.
There are no a priori means of knowing which method will perform better in a
given case. Consequently, if the user is solving a class of problems, it may
pay to sample each method in turn and choose the one which works best.
The program has three levels of output, and can direct that output to the
terminal or log file, or to a disk file. The three levels are:
- No output from the program whatsoever.
- Information about the starting and end points, and any errors.
- Extensive information about the entire process.
There are two interfaces: a simple one called OPTIF0 and a sophisticated
one called OPTIF9.
The simple interface OPTIF0 requires that the user supply only:
- the number of variables N;
- a starting point or rough solution X;
- a subroutine to evaluate F;
- workspace.
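A rough sketch of a driver for the simple interface follows. The argument
list OPTIF0(NR, N, X, FCN, XPLS, FPLS, GPLS, ITRMCD, A, WRK) and the
workspace sizes shown are assumptions based on the Schnabel, Koontz and
Weiss report, and should be checked against the source code.

      program demo

c  Hypothetical driver for the simple interface OPTIF0, minimizing the
c  sum-of-squares function FCN sketched earlier from the starting
c  point (1,1).  The calling sequence and the workspace dimensions
c  A(NR,N) and WRK(NR,9) are assumptions; consult the source code.

      implicit none

      integer nr
      parameter ( nr = 2 )
      integer n
      parameter ( n = 2 )

      real a(nr,n)
      external fcn
      real fpls
      real gpls(n)
      integer itrmcd
      real wrk(nr,9)
      real x(n)
      real xpls(n)

      x(1) = 1.0e0
      x(2) = 1.0e0

      call optif0 ( nr, n, x, fcn, xpls, fpls, gpls, itrmcd, a, wrk )

      write ( *, '(a,i4)' ) '  Termination code ITRMCD = ', itrmcd
      write ( *, '(a,2g14.6)' ) '  Estimated minimizer XPLS = ',
     &  xpls(1), xpls(2)

      stop
      end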
The sophisticated interface OPTIF9 requires all the above, plus:
- information about whether the gradient will be supplied by the user, or approximated.
- information about whether the Hessian will be supplied by the user, or approximated.
- scale factors for the solution X and the function F.
- the choice of method to use.
- whether the function F is expensive to evaluate or not.
- an estimate of the accuracy of F.
- the maximum number of steps to take.
- the amount of internally generated output desired.
- various error and stepsize tolerances.
If the user only wants to adjust a few of the arguments to OPTIF9,
subroutine DFAULT may be called first to set all of them to default
values; the user can then reset just the arguments of interest before
calling OPTIF9.
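The pattern looks roughly like the fragment below. The argument lists for
DFAULT and OPTIF9 shown here are assumptions reconstructed from the
Schnabel, Koontz and Weiss report; the names, order and meaning of the
parameters, and the workspace sizes, must be verified against the source
code before use.

      program demo9

c  Hypothetical use of DFAULT followed by OPTIF9:  set every control
c  argument to its default value, override a few, then call the
c  sophisticated interface.  The argument lists and the assignment of
c  METHOD values to strategies used below are assumptions and must be
c  checked against the source code.

      implicit none

      integer nr
      parameter ( nr = 2 )
      integer n
      parameter ( n = 2 )

      real a(nr,n)
      external d1fcn
      external d2fcn
      real dlt
      external fcn
      real fpls
      real fscale
      real gpls(n)
      real gradtl
      integer iagflg
      integer iahflg
      integer iexp
      integer ipr
      integer itnlim
      integer itrmcd
      integer method
      integer msg
      integer ndigit
      real stepmx
      real steptl
      real typsiz(n)
      real wrk(nr,9)
      real x(n)
      real xpls(n)

      x(1) = 1.0e0
      x(2) = 1.0e0

c  Set all control arguments to their default values.

      call dfault ( n, x, typsiz, fscale, method, iexp, msg, ndigit,
     &  itnlim, iagflg, iahflg, ipr, dlt, gradtl, stepmx, steptl )

c  Override just the settings of interest, for example the global
c  strategy (hypothetically, METHOD = 2 selects the dogleg trust
c  region) and the iteration limit.

      method = 2
      itnlim = 500

      call optif9 ( nr, n, x, fcn, d1fcn, d2fcn, typsiz, fscale,
     &  method, iexp, msg, ndigit, itnlim, iagflg, iahflg, ipr,
     &  dlt, gradtl, stepmx, steptl, xpls, fpls, gpls, itrmcd,
     &  a, wrk )

      write ( *, '(a,i4)' ) '  Termination code ITRMCD = ', itrmcd

      stop
      end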
Languages:
UNCMIN is available in
a FORTRAN77 version.
Related Data and Programs:
ASA047,
a FORTRAN77 library which
minimizes a scalar function of several variables using the Nelder-Mead algorithm.
COMPASS_SEARCH,
a FORTRAN90 library which
seeks the minimizer of a scalar function of several variables
using compass search, a direct search algorithm that does not use derivatives.
DQED,
a FORTRAN90 library which
solves constrained least squares problems.
ENTRUST,
a MATLAB program which
solves problems in scalar optimization or nonlinear least squares.
MINPACK,
a FORTRAN90 library which
solves systems of nonlinear equations, or the least squares minimization of the
residual of a set of linear or nonlinear equations.
NELDER_MEAD,
a MATLAB program which
minimizes a scalar function of several variables using the Nelder-Mead algorithm.
NL2SOL,
a FORTRAN90 library which
implements an adaptive nonlinear least-squares algorithm.
PRAXIS,
a FORTRAN90 routine which
minimizes a scalar function of several variables.
TEST_OPT,
a FORTRAN90 library which
defines test problems in scalar optimization.
TOMS178,
a FORTRAN77 library which
optimizes a scalar function of multiple variables using the Hooke-Jeeves method.
TOMS611,
a FORTRAN77 library which
solves problems in unconstrained minimization.
Author:
Robert Schnabel, John Koontz, Barry Weiss
Reference:
- John Dennis, Robert Schnabel,
  Numerical Methods for Unconstrained Optimization and Nonlinear Equations,
  SIAM, 1996,
  ISBN13: 978-0-898713-64-0,
  LC: QA402.5.D44.
- Jorge More, Burton Garbow, Kenneth Hillstrom,
  Testing Unconstrained Optimization Software,
  ACM Transactions on Mathematical Software,
  Volume 7, Number 1, March 1981, pages 17-41.
- Jorge More, Burton Garbow, Kenneth Hillstrom,
  Algorithm 566: FORTRAN Subroutines for Testing Unconstrained Optimization Software,
  ACM Transactions on Mathematical Software,
  Volume 7, Number 1, March 1981, pages 136-140.
- Robert Schnabel, John Koontz, Barry Weiss,
  A Modular System of Algorithms for Unconstrained Minimization,
  Technical Report CU-CS-240-82,
  Computer Science Department,
  University of Colorado at Boulder, 1982.
Source Code:
Examples and Tests:
List of Routines:
- BAKSLV solves A'*X=B, A lower triangular.
- CHLHSN Cholesky decomposes the perturbed model Hessian matrix.
- CHOLDC finds the perturbed Cholesky decomposition of A+D.
- D1FCN is a dummy routine for the analytic gradient.
- D2FCN is a dummy routine for the analytic Hessian.
- DFAULT sets default values for each input variable to the minimization package.
- DOGDRV finds the next Newton iterate by the double dogleg method.
- DOGSTP finds the next double dogleg stepsize.
- EXPLAIN prints an explanation for the UNCMIN termination code.
- FORSLV solves A*X=B, where A is lower triangular.
- FSTOCD finds a central difference approximation to the gradient.
- FSTOFD finds a forward difference approximation to the gradient.
- GRDCHK compares the analytic gradient against a finite difference estimate.
- HESCHK compares the analytic Hessian against a finite difference estimate.
- HOOKDR finds the next Newton iterate by the More-Hebdon method.
- HOOKST finds the More-Hebdon stepsize.
- HSNINT approximates the initial Hessian by secant updates.
- LLTSLV solves A*X=B when A has been Cholesky factored.
- LNSRCH finds the next Newton iterate by line search.
- MVMLTL computes Y = L * X, where L is lower triangular.
- MVMLTS computes Y = A * X, where A is in symmetric storage.
- MVMLTU computes Y = L' * X, where L is lower triangular.
- OPTCHK checks the input for reasonableness.
- OPTDRV is a driver for the nonlinear optimization code.
- OPTIF0 provides the simplest interface to the optimization package.
- OPTIF9 provides a complete interface to the minimization package.
- OPTSTP checks the optimization stopping criteria.
- QRAUX1 interchanges two rows of an upper Hessenberg matrix.
- QRAUX2 premultiplies a matrix by a Jacobi rotation.
- QRUPDT updates a QR factorization.
- RESULT prints information from the optimization procedure.
- SCLMUL multiplies a vector by a scalar.
- SDOT returns the dot product of two vectors.
- SECFAC updates the Hessian matrix by the BFGS factored method.
- SECUNF updates the Hessian by the BFGS unfactored method.
- SNDOFD finds a second order approximation to the Hessian.
- SNRM2 returns the Euclidean norm of a vector.
- TREGUP accepts the next iterate and updates the trust region.
Last revised on 19 February 2008.