fitbenchmarking.controllers.gradient_free_controller module

Implements a controller for Gradient Free Optimizers

class fitbenchmarking.controllers.gradient_free_controller.GradientFreeController(cost_func)

Bases: fitbenchmarking.controllers.base_controller.Controller

Controller for the Gradient Free Optimizers fitting software.

algorithm_check = {'all': ['HillClimbingOptimizer', 'RepulsingHillClimbingOptimizer', 'SimulatedAnnealingOptimizer', 'RandomSearchOptimizer', 'RandomRestartHillClimbingOptimizer', 'RandomAnnealingOptimizer', 'ParallelTemperingOptimizer', 'ParticleSwarmOptimizer', 'EvolutionStrategyOptimizer', 'BayesianOptimizer', 'TreeStructuredParzenEstimators', 'DecisionTreeOptimizer'], 'bfgs': [], 'conjugate_gradient': [], 'deriv_free': ['HillClimbingOptimizer', 'RepulsingHillClimbingOptimizer', 'SimulatedAnnealingOptimizer', 'RandomSearchOptimizer', 'RandomRestartHillClimbingOptimizer', 'RandomAnnealingOptimizer', 'ParallelTemperingOptimizer', 'ParticleSwarmOptimizer', 'EvolutionStrategyOptimizer', 'BayesianOptimizer', 'TreeStructuredParzenEstimators', 'DecisionTreeOptimizer'], 'gauss_newton': [], 'general': ['HillClimbingOptimizer', 'RepulsingHillClimbingOptimizer', 'SimulatedAnnealingOptimizer', 'RandomSearchOptimizer', 'RandomRestartHillClimbingOptimizer', 'RandomAnnealingOptimizer', 'ParallelTemperingOptimizer', 'ParticleSwarmOptimizer', 'EvolutionStrategyOptimizer', 'BayesianOptimizer', 'TreeStructuredParzenEstimators', 'DecisionTreeOptimizer'], 'global_optimization': ['HillClimbingOptimizer', 'RepulsingHillClimbingOptimizer', 'SimulatedAnnealingOptimizer', 'RandomSearchOptimizer', 'RandomRestartHillClimbingOptimizer', 'RandomAnnealingOptimizer', 'ParallelTemperingOptimizer', 'ParticleSwarmOptimizer', 'EvolutionStrategyOptimizer', 'BayesianOptimizer', 'TreeStructuredParzenEstimators', 'DecisionTreeOptimizer'], 'levenberg-marquardt': [], 'ls': [], 'simplex': [], 'steepest_descent': [], 'trust_region': []}

Within the controller class, you must initialize a dictionary, algorithm_check, whose keys are:

  • all - all minimizers

  • ls - least-squares fitting algorithms

  • deriv_free - derivative-free algorithms (algorithms that cannot use derivative information, e.g. the Simplex method in Mantid)

  • general - minimizers which solve a generic min f(x)

  • simplex - derivative-free, simplex-based algorithms, e.g. Nelder-Mead

  • trust_region - algorithms which employ a trust region approach

  • levenberg-marquardt - minimizers that use the Levenberg-Marquardt algorithm

  • gauss_newton - minimizers that use the Gauss-Newton algorithm

  • bfgs - minimizers that use the BFGS algorithm

  • conjugate_gradient - Conjugate Gradient algorithms

  • steepest_descent - Steepest Descent algorithms

  • global_optimization - Global Optimization algorithms

The values of the dictionary are lists of the minimizers for that specific controller that fit into each of the above categories; see, for example, the GSL controller, or the sketch below.
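As a hedged illustration of how this attribute can be populated, the sketch below reuses one list of the Gradient Free Optimizers minimizers for every category they satisfy, matching the attribute value shown above; the helper name _GFO_MINIMIZERS is an assumption for readability, not part of the framework.

    # Sketch of algorithm_check for a controller subclass. Categories with
    # no applicable minimizer are left as empty lists.
    _GFO_MINIMIZERS = [
        'HillClimbingOptimizer', 'RepulsingHillClimbingOptimizer',
        'SimulatedAnnealingOptimizer', 'RandomSearchOptimizer',
        'RandomRestartHillClimbingOptimizer', 'RandomAnnealingOptimizer',
        'ParallelTemperingOptimizer', 'ParticleSwarmOptimizer',
        'EvolutionStrategyOptimizer', 'BayesianOptimizer',
        'TreeStructuredParzenEstimators', 'DecisionTreeOptimizer',
    ]

    algorithm_check = {
        'all': _GFO_MINIMIZERS,
        'ls': [],
        'deriv_free': _GFO_MINIMIZERS,
        'general': _GFO_MINIMIZERS,
        'simplex': [],
        'trust_region': [],
        'levenberg-marquardt': [],
        'gauss_newton': [],
        'bfgs': [],
        'conjugate_gradient': [],
        'steepest_descent': [],
        'global_optimization': _GFO_MINIMIZERS,
    }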

cleanup()

Convert the result to a numpy array and populate the variables that the results will be read from.
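A minimal sketch of what such a cleanup step might look like, assuming the optimizer's best parameters were stored on a results attribute during fit() and that the base class reads final_params and flag; the specific flag values used here are assumptions for illustration.

    import numpy as np

    def cleanup(self):
        """Convert the stored result to a numpy array and set the
        attributes the framework reads the fit results from."""
        if self.results is not None:
            # assumed convention: flag 0 marks a successful fit
            self.flag = 0
            self.final_params = np.array(self.results)
        else:
            # assumed convention: a non-zero flag marks a failed fit
            self.flag = 2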

controller_name = 'gradient_free'

A name to be used in tables. If this is set to None it will be inferred from the class name.

fit()

Run problem with Gradient Free Optimizers
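As a rough sketch of how the Gradient Free Optimizers package is typically driven, the standalone example below fits a toy straight line with HillClimbingOptimizer; the data, search-space bounds, grid resolution, and iteration count are illustrative assumptions, not the controller's actual settings.

    import numpy as np
    from gradient_free_optimizers import HillClimbingOptimizer

    # Toy data: a straight line with a little noise (illustrative only).
    x_data = np.linspace(0, 10, 50)
    y_data = 2.0 * x_data + 1.0 + np.random.normal(scale=0.1, size=x_data.size)

    # Gradient Free Optimizers maximises a score, so a least-squares fit
    # returns the negated sum of squared residuals.
    def score(params):
        residuals = y_data - (params["slope"] * x_data + params["intercept"])
        return -np.sum(residuals ** 2)

    # Each parameter is searched over a discrete grid of candidate values.
    search_space = {
        "slope": np.linspace(-5.0, 5.0, 201),
        "intercept": np.linspace(-5.0, 5.0, 201),
    }

    opt = HillClimbingOptimizer(search_space)
    opt.search(score, n_iter=500)
    best = opt.best_para  # dict of the best parameter values found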

setup()

Setup for Gradient Free Optimizers
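A hedged sketch of the kind of preparation setup() might perform: building a discrete search space around the starting parameter values and looking up the requested optimizer class by name. The attribute names param_names, initial_params, and minimizer, as well as the grid window and resolution, are assumptions for illustration.

    import numpy as np
    import gradient_free_optimizers

    def setup(self):
        """Build a search space around the initial parameter values and
        resolve the requested optimizer class by name."""
        # One grid of candidate values per parameter, centred on the
        # starting guess; the +/-10 window and 1000 points are arbitrary.
        self.search_space = {
            name: np.linspace(value - 10.0, value + 10.0, 1000)
            for name, value in zip(self.param_names, self.initial_params)
        }
        # self.minimizer holds the optimizer name, e.g. 'HillClimbingOptimizer'
        self.optimizer_class = getattr(gradient_free_optimizers, self.minimizer)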