# Minimizer Options

This section is used to declare the minimizers to use for each fitting software.

Warning

Options set in this section will only have an effect if the related software is also set in Fitting Options (either explicitly, or as a default option).
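For example, a choice of Bumps minimizers only takes effect when Bumps is also selected as a fitting software. A minimal sketch of a combined options file is below; the `[FITTING]` section name and `software` option are assumptions here, so see Fitting Options for the authoritative names:

```
[FITTING]
software: bumps

[MINIMIZERS]
bumps: amoeba
       lm-bumps
```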

## Bumps (`bumps`)

Bumps is a set of data fitting (and Bayesian uncertainty analysis) routines.
It came out of the University of Maryland and NIST as part of the DANSE
(*Distributed Data Analysis of Neutron Scattering Experiments*) project.

FitBenchmarking currently supports the Bumps minimizers:

- Nelder-Mead Simplex (`amoeba`)
- Levenberg-Marquardt (`lm-bumps`)
- Quasi-Newton BFGS (`newton`)
- Differential Evolution (`de`)
- MINPACK (`mp`), a translation of MINPACK to Python

**Links** GitHub - bumps

The Bumps minimizers are set as follows:

```
[MINIMIZERS]
bumps: amoeba
       lm-bumps
       newton
       de
       mp
```

Warning

The additional dependency Bumps must be installed for this to be available; See Extra dependencies.

## DFO (`dfo`)

There are two Derivative-Free Optimization packages, DFO-LS and DFO-GN. They are derivative-free optimization solvers developed by Lindon Roberts at the University of Oxford, in conjunction with NAG, and are particularly well suited to solving noisy problems.

FitBenchmarking currently supports the DFO minimizers:

- Derivative-Free Optimizer for Least-Squares (`dfols`)
- Derivative-Free Gauss-Newton (`dfogn`)

The DFO minimizers are set as follows:

```
[MINIMIZERS]
dfo: dfols
     dfogn
```

Warning

Additional dependencies DFO-GN and DFO-LS must be installed for these to be available; See Extra dependencies.

## GSL (`gsl`)

The GNU Scientific Library is a numerical library that provides a wide range of mathematical routines. We call GSL using the pyGSL Python interface.

The GSL routines have a number of parameters that need to be chosen, often without default suggestions. We have taken the values as used by Mantid.

We provide implementations for the following packages in the multiminimize and multifit sections of the library:

- Levenberg-Marquardt (unscaled) (`lmder`)
- Levenberg-Marquardt (scaled) (`lmsder`)
- Nelder-Mead Simplex Algorithm (`nmsimplex`)
- Nelder-Mead Simplex Algorithm (version 2) (`nmsimplex2`)
- Polak-Ribiere Conjugate Gradient Algorithm (`conjugate_pr`)
- Fletcher-Reeves Conjugate Gradient Algorithm (`conjugate_fr`)
- The vector quasi-Newton BFGS method (`vector_bfgs`)
- The vector quasi-Newton BFGS method (version 2) (`vector_bfgs2`)
- Steepest Descent (`steepest_descent`)

**Links** SourceForge PyGSL

The GSL minimizers are set as follows:

```
[MINIMIZERS]
gsl: lmsder
     lmder
     nmsimplex
     nmsimplex2
     conjugate_pr
     conjugate_fr
     vector_bfgs
     vector_bfgs2
     steepest_descent
```

Warning

The external packages GSL and pygsl must be installed to use these minimizers.

## Mantid (`mantid`)

Mantid is a framework created to manipulate and analyze neutron scattering and muon spectroscopy data. It has support for a number of minimizers, most of which are from GSL.

- BFGS (`BFGS`)
- Conjugate gradient (Fletcher-Reeves) (`Conjugate gradient (Fletcher-Reeves imp.)`)
- Conjugate gradient (Polak-Ribiere) (`Conjugate gradient (Polak-Ribiere imp.)`)
- Damped Gauss-Newton (`Damped GaussNewton`)
- Levenberg-Marquardt algorithm (`Levenberg-Marquardt`)
- Levenberg-Marquardt MD (`Levenberg-MarquardtMD`), an implementation of Levenberg-Marquardt intended for MD workspaces, where work is divided into chunks to achieve greater efficiency for a large number of data points
- Simplex (`Simplex`)
- Steepest Descent (`SteepestDescent`)
- Trust Region (`Trust Region`), an implementation of one of the algorithms available in RALFit

The Mantid minimizers are set as follows:

```
[MINIMIZERS]
mantid: BFGS
        Conjugate gradient (Fletcher-Reeves imp.)
        Conjugate gradient (Polak-Ribiere imp.)
        Damped GaussNewton
        Levenberg-Marquardt
        Levenberg-MarquardtMD
        Simplex
        SteepestDescent
        Trust Region
```

Warning

The external package Mantid must be installed to use these minimizers.

## Minuit (`minuit`)

CERN developed the Minuit package to find the minimum value of a multi-parameter function and to compute the uncertainties. We interface with it via the Python package iminuit, with support for the 2.x series.

- Minuit’s MIGRAD (`minuit`)

**Links** Github - iminuit

The Minuit minimizers are set as follows:

```
[MINIMIZERS]
minuit: minuit
```

Warning

The additional dependency Minuit must be installed for this to be available; See Extra dependencies.

## RALFit (`ralfit`)

RALFit is a nonlinear least-squares solver, the development of which was funded by the EPSRC grant Least-Squares: Fit for the Future. RALFit is designed to be able to take advantage of higher order derivatives, although only first order derivatives are currently utilized in FitBenchmarking.

- Gauss-Newton, trust region method (`gn`)
- Hybrid Newton/Gauss-Newton, trust region method (`hybrid`)
- Gauss-Newton, regularization (`gn_reg`)
- Hybrid Newton/Gauss-Newton, regularization (`hybrid_reg`)

**Links** GitHub - RALFit. RALFit’s documentation on: Gauss-Newton/hybrid models, the trust region method, and the regularization method

The RALFit minimizers are set as follows:

```
[MINIMIZERS]
ralfit: gn
        gn_reg
        hybrid
        hybrid_reg
```

Warning

The external package RALFit must be installed to use these minimizers.

## SciPy (`scipy`)

SciPy is the standard Python package for mathematical software. In particular, we use the minimize solver for general minimization problems from the optimization chapter of the SciPy library. Currently we only use the algorithms that do not require Hessian information as input.

- Nelder-Mead algorithm (`Nelder-Mead`)
- Powell algorithm (`Powell`)
- Conjugate gradient algorithm (`CG`)
- BFGS algorithm (`BFGS`)
- Newton-CG algorithm (`Newton-CG`)
- L-BFGS-B algorithm (`L-BFGS-B`)
- Truncated Newton (TNC) algorithm (`TNC`)
- Sequential Least SQuares Programming (`SLSQP`)

**Links** Github - SciPy minimize

The SciPy minimizers are set as follows:

```
[MINIMIZERS]
scipy: Nelder-Mead
       Powell
       CG
       BFGS
       Newton-CG
       L-BFGS-B
       TNC
       SLSQP
```
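These names correspond to the `method` argument of `scipy.optimize.minimize`. A minimal standalone sketch, using the Rosenbrock function (a standard test problem, not taken from FitBenchmarking) as the objective:

```python
import numpy as np
from scipy.optimize import minimize

# The Rosenbrock function: a standard test objective with minimum at [1, 1].
def rosenbrock(p):
    x, y = p
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

# "Nelder-Mead" is the same string used in the options file above.
result = minimize(rosenbrock, x0=[-1.0, 1.0], method="Nelder-Mead")
print(result.x)
```

Any of the other option names listed above can be substituted for `"Nelder-Mead"`, though gradient-based methods like `Newton-CG` also accept derivative information when available.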

## SciPy LS (`scipy_ls`)

SciPy is the standard Python package for mathematical software. In particular, we use the least_squares solver for least-squares minimization problems from the optimization chapter of the SciPy library.

- Levenberg-Marquardt with supplied Jacobian (`lm-scipy`), a wrapper around MINPACK
- Levenberg-Marquardt with no Jacobian passed (`lm-scipy-no-jac`), as above, but using MINPACK’s approximate Jacobian
- The Trust Region Reflective algorithm (`trf`)
- A dogleg algorithm with rectangular trust regions (`dogbox`)

**Links** Github - SciPy least_squares

The SciPy least squares minimizers are set as follows:

```
[MINIMIZERS]
scipy_ls: lm-scipy-no-jac
          lm-scipy
          trf
          dogbox
```
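The `trf` and `dogbox` names map onto the `method` argument of `scipy.optimize.least_squares`, which, unlike `minimize`, takes a vector of residuals rather than a scalar cost. A minimal sketch with an invented exponential-decay model:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic data from a hypothetical decay model: 2.5 * exp(-1.3 * t).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 4.0, 30)
y = 2.5 * np.exp(-1.3 * t) + rng.normal(0.0, 0.01, t.size)

# least_squares minimizes the sum of squares of this residual vector.
def residuals(p):
    amplitude, rate = p
    return amplitude * np.exp(-rate * t) - y

result = least_squares(residuals, x0=[1.0, 1.0], method="trf")
print(result.x)
```

Passing `method="lm"` instead selects the MINPACK wrapper that underlies the `lm-scipy` options; it does not support bounds, which is why the trust-region variants are often preferred.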