This section is used to declare the minimizers to use for each fitting software.
Options set in this section will only have an effect if the related software is also set in Fitting Options (either explicitly, or as a default option).
Bumps is a set of data fitting (and Bayesian uncertainty analysis) routines. It came out of the University of Maryland and NIST as part of the DANSE (Distributed Data Analysis of Neutron Scattering Experiments) project.
FitBenchmarking currently supports the Bumps minimizers:
- Nelder-Mead Simplex (amoeba)
- Levenberg-Marquardt (lm-bumps)
- Quasi-Newton BFGS (newton)
- Differential Evolution (de)
- MINPACK (mp) - A translation of MINPACK to Python.
Links GitHub - bumps
The Bumps minimizers are set as follows:
[MINIMIZERS]
bumps: amoeba
       lm-bumps
       newton
       de
       mp
The additional dependency Bumps must be installed for this to be available; See Extra dependencies.
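The [MINIMIZERS] section follows standard INI syntax, so multi-line values like the one above can be read with Python's configparser. A minimal sketch of that parsing, assuming a hypothetical options file with the same layout (FitBenchmarking's own options parser may differ in detail):

```python
from configparser import ConfigParser

# Hypothetical options file contents mirroring the example above.
options_text = """
[MINIMIZERS]
bumps: amoeba
       lm-bumps
       newton
       de
       mp
"""

# A 'list' converter gives us parser.getlist for whitespace-separated values.
parser = ConfigParser(converters={"list": lambda v: v.split()})
parser.read_string(options_text)

# Multi-line INI values arrive as one newline-joined string;
# splitting on whitespace recovers the individual minimizer names.
minimizers = parser.getlist("MINIMIZERS", "bumps")
print(minimizers)  # ['amoeba', 'lm-bumps', 'newton', 'de', 'mp']
```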
There are two Derivative-Free Optimization packages, DFO-LS and DFO-GN. These are derivative-free optimization solvers developed by Lindon Roberts at the University of Oxford, in conjunction with NAG. They are particularly well suited to solving noisy problems.
FitBenchmarking currently supports the DFO minimizers:
- Derivative-Free Optimizer for Least-Squares (dfols)
- Derivative-Free Gauss-Newton (dfogn)
The DFO minimizers are set as follows:
[MINIMIZERS]
dfo: dfols
     dfogn
Additional dependencies DFO-GN and DFO-LS must be installed for these to be available; See Extra dependencies.
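A toy illustration of why derivative-free solvers suit noisy objectives: even tiny evaluation noise destroys finite-difference derivative estimates as the step size shrinks, which is exactly what gradient-based methods rely on. The alternating "noise" below is a deterministic stand-in for real measurement noise, chosen only so the example is reproducible:

```python
def noisy_f(x, _state=[1]):
    # Smooth quadratic plus a tiny alternating "noise" term (magnitude 1e-6).
    _state[0] = -_state[0]
    return (x - 1.0) ** 2 + 1e-6 * _state[0]

# Forward-difference slope estimates at x = 3 (the true slope is 4).
estimates = {h: (noisy_f(3.0 + h) - noisy_f(3.0)) / h
             for h in (1e-2, 1e-5, 1e-8)}

for h, est in estimates.items():
    print(h, est)
# At h = 1e-2 the estimate is near 4; at h = 1e-8 the noise contribution
# (about 2e-6 / 1e-8 = 200) dominates and the estimate is useless.
```

Derivative-free methods sidestep this by working only with function values, rather than differencing them at small steps.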
The GSL routines have a number of parameters that need to be chosen, often without default suggestions. We have taken the values as used by Mantid.
FitBenchmarking currently supports the GSL minimizers:
- Levenberg-Marquardt (unscaled) (lmder)
- Levenberg-Marquardt (scaled) (lmsder)
- Nelder-Mead Simplex Algorithm (nmsimplex)
- Nelder-Mead Simplex Algorithm (version 2) (nmsimplex2)
- Polak-Ribiere Conjugate Gradient Algorithm (conjugate_pr)
- Fletcher-Reeves Conjugate-Gradient (conjugate_fr)
- The vector quasi-Newton BFGS method (vector_bfgs)
- The vector quasi-Newton BFGS method (version 2) (vector_bfgs2)
- Steepest Descent (steepest_descent)
Links SourceForge PyGSL
The GSL minimizers are set as follows:
[MINIMIZERS]
gsl: lmsder
     lmder
     nmsimplex
     nmsimplex2
     conjugate_pr
     conjugate_fr
     vector_bfgs
     vector_bfgs2
     steepest_descent
The external packages GSL and pygsl must be installed to use these minimizers.
Mantid is a framework created to manipulate and analyze neutron scattering and muon spectroscopy data. It has support for a number of minimizers, most of which are from GSL.
FitBenchmarking currently supports the Mantid minimizers:
- BFGS (BFGS)
- Conjugate gradient (Fletcher-Reeves) (Conjugate gradient (Fletcher-Reeves imp.))
- Conjugate gradient (Polak-Ribiere) (Conjugate gradient (Polak-Ribiere imp.))
- Damped GaussNewton (Damped GaussNewton)
- Levenberg-Marquardt algorithm (Levenberg-Marquardt)
- Levenberg-Marquardt MD (Levenberg-MarquardtMD) - An implementation of Levenberg-Marquardt intended for MD workspaces, where work is divided into chunks to achieve greater efficiency for a large number of data points.
- Simplex (Simplex)
- SteepestDescent (SteepestDescent)
- Trust Region (Trust Region) - An implementation of one of the algorithms available in RALFit.
The Mantid minimizers are set as follows:
[MINIMIZERS]
mantid: BFGS
        Conjugate gradient (Fletcher-Reeves imp.)
        Conjugate gradient (Polak-Ribiere imp.)
        Damped GaussNewton
        Levenberg-Marquardt
        Levenberg-MarquardtMD
        Simplex
        SteepestDescent
        Trust Region
The external package Mantid must be installed to use these minimizers.
CERN developed the Minuit package to find the minimum value of a multi-parameter function, and also to compute the uncertainties. We interface via the Python interface iminuit, with support for the 2.x series.
FitBenchmarking currently supports the Minuit minimizer:
- Minuit’s MIGRAD (minuit)
Links GitHub - iminuit
The Minuit minimizers are set as follows:
[MINIMIZERS]
minuit: minuit
The additional dependency Minuit must be installed for this to be available; See Extra dependencies.
RALFit is a nonlinear least-squares solver, the development of which was funded by the EPSRC grant Least-Squares: Fit for the Future. RALFit is designed to be able to take advantage of higher order derivatives, although only first order derivatives are currently utilized in FitBenchmarking.
FitBenchmarking currently supports the RALFit minimizers:
- Gauss-Newton, trust region method (gn)
- Hybrid Newton/Gauss-Newton, trust region method (hybrid)
- Gauss-Newton, regularization (gn_reg)
- Hybrid Newton/Gauss-Newton, regularization (hybrid_reg)
The RALFit minimizers are set as follows:
[MINIMIZERS]
ralfit: gn
        gn_reg
        hybrid
        hybrid_reg
The external package RALFit must be installed to use these minimizers.
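The Gauss-Newton iteration that gn builds on can be sketched in a few lines of NumPy. This is only an illustration of the basic step (solve the linearized least-squares problem J Δp ≈ -r and update p), not RALFit's implementation, which adds the trust-region and regularization safeguards named above; the model and data are invented for the example:

```python
import numpy as np

# Toy exponential model y = a * exp(b * x), with noise-free data
# generated at a = 2, b = 0.5.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(0.5 * x)

def residual(p):
    a, b = p
    return a * np.exp(b * x) - y

def jacobian(p):
    # Analytic first derivatives with respect to a and b.
    a, b = p
    e = np.exp(b * x)
    return np.column_stack((e, a * x * e))

# Plain Gauss-Newton: at each step solve J dp ~= -r in the
# least-squares sense, then update the parameters.
p = np.array([1.5, 0.3])
for _ in range(50):
    r = residual(p)
    dp = np.linalg.lstsq(jacobian(p), -r, rcond=None)[0]
    p = p + dp
    if np.linalg.norm(dp) < 1e-12:
        break

print(p)  # should be close to [2.0, 0.5]
```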
SciPy is the standard Python package for mathematical software. In particular, we use the minimize solver for general minimization problems from the optimization chapter of the SciPy library. Currently we only use the algorithms that do not require Hessian information as inputs.
FitBenchmarking currently supports the SciPy minimizers:
- Nelder-Mead algorithm (Nelder-Mead)
- Powell algorithm (Powell)
- Conjugate gradient algorithm (CG)
- BFGS algorithm (BFGS)
- Newton-CG algorithm (Newton-CG)
- L-BFGS-B algorithm (L-BFGS-B)
- Truncated Newton (TNC) algorithm (TNC)
- Sequential Least SQuares Programming (SLSQP)
Links GitHub - SciPy minimize
The SciPy minimizers are set as follows:
[MINIMIZERS]
scipy: Nelder-Mead
       Powell
       CG
       BFGS
       Newton-CG
       L-BFGS-B
       TNC
       SLSQP
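Each name in the list above is passed as the method argument of scipy.optimize.minimize. A minimal sketch using the Nelder-Mead entry on the Rosenbrock function (the starting point and tolerances are arbitrary choices for illustration):

```python
import numpy as np
from scipy.optimize import minimize

# Rosenbrock function, minimized at (1, 1).
def cost(p):
    return (1 - p[0]) ** 2 + 100 * (p[1] - p[0] ** 2) ** 2

# method="Nelder-Mead" selects the same algorithm as the
# Nelder-Mead minimizer name in the options file.
result = minimize(cost, x0=[-1.0, 2.0], method="Nelder-Mead",
                  options={"xatol": 1e-9, "fatol": 1e-9, "maxiter": 5000})
print(result.x)  # should be close to [1.0, 1.0]
```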
SciPy LS (scipy_ls)
SciPy is the standard Python package for mathematical software. In particular, we use the least_squares solver for least-squares minimization problems from the optimization chapter of the SciPy library.
FitBenchmarking currently supports the SciPy least-squares minimizers:
- Levenberg-Marquardt with supplied Jacobian (lm-scipy) - a wrapper around MINPACK
- Levenberg-Marquardt with no Jacobian passed (lm-scipy-no-jac) - as above, but using MINPACK’s approximate Jacobian
- The Trust Region Reflective algorithm (trf)
- A dogleg algorithm with rectangular trust regions (dogbox)
The SciPy least squares minimizers are set as follows:
[MINIMIZERS]
scipy_ls: lm-scipy-no-jac
          lm-scipy
          trf
          dogbox
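The trf and dogbox names above correspond to values of the method argument of scipy.optimize.least_squares, while the lm-scipy variants use method="lm" (the MINPACK wrapper) with and without a user-supplied Jacobian. A minimal sketch using trf, with a model and data invented for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic, noise-free data from y = 3 * exp(-1.5 * x).
x = np.linspace(0.0, 2.0, 30)
y = 3.0 * np.exp(-1.5 * x)

# least_squares expects the residual vector, not the summed cost.
def residual(p):
    return p[0] * np.exp(p[1] * x) - y

# method="trf" selects the Trust Region Reflective entry above.
result = least_squares(residual, x0=[1.0, -1.0], method="trf")
print(result.x)  # should be close to [3.0, -1.5]
```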