fitbenchmarking.controllers.nlopt_controller module

Implements a controller for the NLOPT software.

class fitbenchmarking.controllers.nlopt_controller.NLoptController(cost_func)

Bases: fitbenchmarking.controllers.base_controller.Controller

Controller for NLOPT

algorithm_check = {
    'all': ['LN_BOBYQA', 'LN_NEWUOA', 'LN_NEWUOA_BOUND', 'LN_PRAXIS', 'LD_SLSQP',
            'LD_VAR2', 'LD_VAR1', 'AUGLAG', 'AUGLAG_EQ', 'LN_NELDERMEAD', 'LN_SBPLX',
            'LN_COBYLA', 'LD_CCSAQ', 'LD_MMA', 'LD_TNEWTON_PRECOND_RESTART',
            'LD_TNEWTON_PRECOND', 'LD_TNEWTON_RESTART', 'LD_TNEWTON', 'LD_LBFGS',
            'GN_DIRECT', 'GN_DIRECT_L', 'GN_DIRECT_L_RAND', 'GNL_DIRECT_NOSCAL',
            'GN_DIRECT_L_NOSCAL', 'GN_DIRECT_L_RAND_NOSCAL', 'GN_ORIG_DIRECT',
            'GN_ORIG_DIRECT_L', 'GN_CRS2_LM', 'G_MLSL_LDS', 'G_MLSL', 'GD_STOGO',
            'GD_STOGO_RAND', 'GN_AGS', 'GN_ISRES'],
    'bfgs': ['LD_LBFGS'],
    'conjugate_gradient': ['LN_COBYLA'],
    'deriv_free': ['LN_BOBYQA', 'LN_NEWUOA', 'LN_NEWUOA_BOUND', 'LN_PRAXIS'],
    'gauss_newton': ['LD_TNEWTON_PRECOND_RESTART', 'LD_TNEWTON_PRECOND',
                     'LD_TNEWTON_RESTART', 'LD_TNEWTON'],
    'general': ['LD_SLSQP', 'LD_VAR2', 'LD_VAR1', 'AUGLAG', 'AUGLAG_EQ'],
    'global_optimization': ['GN_DIRECT', 'GN_DIRECT_L', 'GN_DIRECT_L_RAND',
                            'GNL_DIRECT_NOSCAL', 'GN_DIRECT_L_NOSCAL',
                            'GN_DIRECT_L_RAND_NOSCAL', 'GN_ORIG_DIRECT',
                            'GN_ORIG_DIRECT_L', 'GN_CRS2_LM', 'G_MLSL_LDS', 'G_MLSL',
                            'GD_STOGO', 'GD_STOGO_RAND', 'GN_AGS', 'GN_ISRES'],
    'levenberg-marquardt': [],
    'ls': [],
    'simplex': ['LN_NELDERMEAD', 'LN_SBPLX'],
    'steepest_descent': [],
    'trust_region': ['LN_COBYLA', 'LD_CCSAQ', 'LD_MMA']}

Within the controller class, you must initialize a dictionary, algorithm_check, such that the keys are given by:

  • all - all minimizers

  • ls - least-squares fitting algorithms

  • deriv_free - derivative-free algorithms (these are algorithms that cannot use information about derivatives, e.g. the Simplex method in Mantid)

  • general - minimizers which solve a generic min f(x)

  • simplex - derivative-free, simplex-based algorithms, e.g. Nelder-Mead

  • trust_region - algorithms which employ a trust region approach

  • levenberg-marquardt - minimizers that use the Levenberg-Marquardt algorithm

  • gauss_newton - minimizers that use the Gauss-Newton algorithm

  • bfgs - minimizers that use the BFGS algorithm

  • conjugate_gradient - Conjugate Gradient algorithms

  • steepest_descent - Steepest Descent algorithms

  • global_optimization - Global Optimization algorithms

The values of the dictionary are lists of the minimizers for that specific controller that fit into each of the above categories. See, for example, the GSL controller.
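
As a rough illustration (not taken from any particular controller), the expected shape of the dictionary is sketched below. The minimizer names are placeholders; every key must be present, mapping to an empty list when the software offers no minimizer in that category.

  # Illustrative sketch only: the minimizer names below are placeholders,
  # not the full NLOPT list documented for this class.
  algorithm_check = {
      'all': ['LN_NELDERMEAD', 'LD_LBFGS', 'GN_DIRECT'],
      'ls': [],
      'deriv_free': ['LN_NELDERMEAD'],
      'general': [],
      'simplex': ['LN_NELDERMEAD'],
      'trust_region': [],
      'levenberg-marquardt': [],
      'gauss_newton': [],
      'bfgs': ['LD_LBFGS'],
      'conjugate_gradient': [],
      'steepest_descent': [],
      'global_optimization': ['GN_DIRECT'],
  }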

cleanup()

Convert the result to a numpy array and populate the variables that the results will be read from.
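
A minimal sketch of what such a cleanup step can look like is given below. The underscore-prefixed attributes are hypothetical names assumed to have been set in fit(), not the controller's actual internals, and it is assumed that flag and final_params are the outputs read by the framework after fitting.

  import numpy as np

  def cleanup(self):
      # Hypothetical attributes: self._popt (parameters returned by NLOPT) and
      # self._status (NLOPT return code) are assumed to be set in fit().
      # Positive NLOPT return codes indicate some form of success.
      self.flag = 0 if self._status > 0 else 2
      self.final_params = np.asarray(self._popt, dtype=float)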

fit()

Run the problem with NLOPT.
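
A hedged sketch of the fitting step is shown below, assuming an nlopt.opt object was stored on the controller during setup(); the underscore-prefixed names are illustrative, not the attributes actually used by this controller.

  def fit(self):
      # self._opt is assumed to be the nlopt.opt instance built in setup();
      # self.initial_params holds the starting parameter values.
      self._popt = self._opt.optimize(self.initial_params)
      self._status = self._opt.last_optimize_result()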

jacobian_enabled_solvers = ['LD_SLSQP', 'LD_VAR2', 'LD_VAR1', 'LD_CCSAQ', 'LD_MMA', 'LD_TNEWTON_PRECOND_RESTART', 'LD_TNEWTON_PRECOND', 'LD_TNEWTON_RESTART', 'LD_TNEWTON', 'GD_STOGO', 'GD_STOGO_RAND']

Within the controller class, you must define the list jacobian_enabled_solvers if any of the minimizers for the specific software are able to use Jacobian information.

  • jacobian_enabled_solvers: a list of minimizers in a specific software that allow Jacobian information to be passed into the fitting algorithm
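
For illustration, a controller whose gradient-based minimizers accept an analytic Jacobian might declare the following; the names shown are an example subset, not the list documented for this class.

  # Illustrative subset only: gradient-based ('LD_*'/'GD_*') algorithms are the
  # ones able to make use of Jacobian information.
  jacobian_enabled_solvers = ['LD_LBFGS', 'LD_SLSQP', 'GD_STOGO']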

objective_master_nlopt(x, grad)

NLOPT objective function
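
NLOPT expects an objective with the signature f(x, grad), where grad is filled in place when a gradient-based algorithm is used and arrives empty for derivative-free ones. The standalone sketch below uses the Rosenbrock function as a stand-in for the benchmark cost function; it is not the controller's implementation.

  import numpy as np
  import nlopt

  def objective(x, grad):
      # Fill grad in place only when NLOPT asks for it (gradient-based solvers).
      if grad.size > 0:
          grad[0] = -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2)
          grad[1] = 200.0 * (x[1] - x[0] ** 2)
      return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

  opt = nlopt.opt(nlopt.LD_LBFGS, 2)   # 2 parameters, gradient-based minimizer
  opt.set_min_objective(objective)
  opt.set_xtol_rel(1e-8)
  x_opt = opt.optimize(np.array([-1.2, 1.0]))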

setup()

Set up the problem ready to be run with NLOPT.
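
A hedged sketch of the kind of setup NLOPT needs is given below, assuming self.minimizer holds one of the algorithm names listed above and self.value_ranges holds optional parameter bounds; self._opt and the stopping criterion are illustrative choices, not the controller's actual settings.

  import nlopt

  def setup(self):
      # Look up the NLOPT algorithm constant from its name, e.g. nlopt.LD_LBFGS.
      algorithm = getattr(nlopt, self.minimizer)
      self._opt = nlopt.opt(algorithm, len(self.initial_params))
      self._opt.set_min_objective(self.objective_master_nlopt)
      self._opt.set_maxeval(500)  # illustrative stopping criterion
      if self.value_ranges is not None:
          # Assumes value_ranges is a list of (lower, upper) bound pairs.
          lower, upper = zip(*self.value_ranges)
          self._opt.set_lower_bounds(lower)
          self._opt.set_upper_bounds(upper)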