fitbenchmarking.controllers.scipy_controller module

Implements a controller for the SciPy fitting software. In particular, this controller wraps the scipy.optimize.minimize solver for general minimization problems.
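The controller drives SciPy's general-purpose minimizer, scipy.optimize.minimize. As a rough, standalone illustration of the underlying call (independent of FitBenchmarking), using the Rosenbrock test function shipped with SciPy:

    import numpy as np
    from scipy.optimize import minimize, rosen

    # 'method' accepts any of the minimizers listed in algorithm_check['all'],
    # e.g. 'Nelder-Mead', 'BFGS' or 'trust-constr'.
    result = minimize(rosen, x0=np.array([1.3, 0.7, 0.8]), method="Nelder-Mead")
    print(result.x)        # best-fit parameters, close to [1, 1, 1]
    print(result.success)  # True if the solver reports convergence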

class fitbenchmarking.controllers.scipy_controller.ScipyController(cost_func)

Bases: fitbenchmarking.controllers.base_controller.Controller

Controller for the SciPy fitting software.

algorithm_check = {'all': ['Nelder-Mead', 'Powell', 'CG', 'BFGS', 'Newton-CG', 'L-BFGS-B', 'TNC', 'SLSQP', 'COBYLA', 'trust-ncg', 'trust-exact', 'trust-krylov', 'trust-constr', 'dogleg'], 'bfgs': ['BFGS', 'L-BFGS-B'], 'conjugate_gradient': ['CG', 'Newton-CG', 'Powell'], 'deriv_free': ['Nelder-Mead', 'Powell', 'COBYLA'], 'gauss_newton': [], 'general': ['Nelder-Mead', 'Powell', 'CG', 'BFGS', 'Newton-CG', 'L-BFGS-B', 'TNC', 'SLSQP'], 'global_optimization': [], 'levenberg-marquardt': [], 'ls': [None], 'simplex': ['Nelder-Mead'], 'steepest_descent': [], 'trust_region': ['trust-ncg', 'trust-exact', 'trust-krylov', 'trust-constr', 'dogleg']}

Within the controller class, you must initialize a dictionary, algorithm_check, such that the keys are given by:

  • all - all minimizers

  • ls - least-squares fitting algorithms

  • deriv_free - derivative free algorithms (these are algorithms that cannot use information about derivatives – e.g., the Simplex method in Mantid)

  • general - minimizers which solve a generic min f(x)

  • simplex - derivative free simplex based algorithms e.g. Nelder-Mead

  • trust_region - algorithms which employ a trust region approach

  • levenberg-marquardt - minimizers that use the Levenberg-Marquardt algorithm

  • gauss_newton - minimizers that use the Gauss Newton algorithm

  • bfgs - minimizers that use the BFGS algorithm

  • conjugate_gradient - Conjugate Gradient algorithms

  • steepest_descent - Steepest Descent algorithms

  • global_optimization - Global Optimization algorithms

The values of the dictionary are given as a list of minimizers for that specific controller that fit into each of the above categories. See for example the GSL controller.
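For instance, a hypothetical controller wrapping only a simplex and a BFGS implementation might populate the dictionary as in the sketch below (the minimizer names are placeholders for illustration, not part of FitBenchmarking):

    algorithm_check = {
        'all': ['Nelder-Mead', 'BFGS'],
        'ls': [None],                    # no least-squares specific minimizers
        'deriv_free': ['Nelder-Mead'],
        'general': ['Nelder-Mead', 'BFGS'],
        'simplex': ['Nelder-Mead'],
        'trust_region': [],
        'levenberg-marquardt': [],
        'gauss_newton': [],
        'bfgs': ['BFGS'],
        'conjugate_gradient': [],
        'steepest_descent': [],
        'global_optimization': [],
    }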

cleanup()

Convert the result to a numpy array and populate the variables that the results will be read from.
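A typical implementation is only a few lines. The sketch below assumes the solver result was stashed by fit() in a hypothetical attribute self._result, and that the base class attributes final_params and flag are where the framework reads the outcome from (a sketch only, not the actual implementation):

    import numpy as np

    def cleanup(self):
        # Translate the solver's success status into the controller flag
        # (here assumed: 0 = converged, 2 = failed) and expose the best-fit
        # parameters as a numpy array.
        self.flag = 0 if self._result.success else 2
        self.final_params = np.asarray(self._result.x)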

fit()

Run the problem with SciPy.
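Conceptually, fit() forwards the cost function, the starting values and the selected method name to scipy.optimize.minimize and keeps the returned result for cleanup(). A minimal sketch, assuming the framework provides self.cost_func.eval_cost and self.minimizer, and that setup() has cached a hypothetical starting-point array self._x0 (see the sketch under setup() below):

    from scipy.optimize import minimize

    def fit(self):
        # self.minimizer holds the method name chosen from algorithm_check,
        # e.g. 'L-BFGS-B'; SciPy performs the actual optimisation.
        self._result = minimize(
            fun=self.cost_func.eval_cost,
            x0=self._x0,
            method=self.minimizer,
        )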

hessian_enabled_solvers = ['Newton-CG', 'trust-ncg', 'trust-exact', 'trust-krylov', 'trust-constr', 'dogleg']

Within the controller class, you must define the list hessian_enabled_solvers if any of the minimizers for the specific software are able to use hessian information.

  • hessian_enabled_solvers: a list of minimizers in a specific software that allow Hessian information to be passed into the fitting algorithm
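For example, the controller can build the hess keyword argument only when the chosen method appears in this list, since the remaining scipy.optimize.minimize methods cannot make use of a Hessian callable. A sketch with a hypothetical Hessian hook (eval_hessian is an assumed name, for illustration only):

    def _hessian_kwargs(self):
        # Only the Newton and trust-region style methods listed in
        # hessian_enabled_solvers accept a Hessian callable.
        if self.minimizer in self.hessian_enabled_solvers:
            return {'hess': self.eval_hessian}
        return {}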

jacobian_enabled_solvers = ['CG', 'BFGS', 'Newton-CG', 'L-BFGS-B', 'TNC', 'SLSQP', 'trust-ncg', 'trust-exact', 'trust-krylov', 'trust-constr', 'dogleg']

Within the controller class, you must define the list jacobian_enabled_solvers if any of the minimizers for the specific software are able to use jacobian information.

  • jacobian_enabled_solvers: a list of minimizers in a specific software that allow Jacobian information to be passed into the fitting algorithm
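At the SciPy level this corresponds to the jac argument of scipy.optimize.minimize; the gradient-based methods in this list can use it, derivative-free methods such as 'Nelder-Mead' cannot, and for the gradient-based methods omitting it makes SciPy fall back to finite-difference estimates. A standalone illustration using the analytic Rosenbrock derivative shipped with SciPy:

    import numpy as np
    from scipy.optimize import minimize, rosen, rosen_der

    # 'BFGS' is in jacobian_enabled_solvers, so the analytic gradient is used
    # instead of a finite-difference approximation.
    result = minimize(rosen, x0=np.array([1.3, 0.7, 0.8]),
                      method="BFGS", jac=rosen_der)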

setup()

Set up the problem ready to be run with SciPy.
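In the simplest case this amounts to caching the starting values in the form scipy.optimize.minimize expects. A minimal sketch, assuming the base class provides initial_params and using a hypothetical attribute _x0 for the cached array:

    import numpy as np

    def setup(self):
        # scipy.optimize.minimize expects a flat array of starting values;
        # convert the list supplied by the problem definition and cache it
        # for fit().
        self._x0 = np.asarray(self.initial_params, dtype=float)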