fitbenchmarking.controllers.mantid_controller module

Implements a controller for the Mantid fitting software.

class fitbenchmarking.controllers.mantid_controller.MantidController(cost_func)

Bases: fitbenchmarking.controllers.base_controller.Controller

Controller for the Mantid fitting software.

Mantid requires subscribing a custom function in a predefined format, so this controller creates that in setup.

COST_FUNCTION_MAP = {'nlls': 'Unweighted least squares', 'poisson': 'Poisson', 'weighted_nlls': 'Least squares'}

A map from fitbenchmarking cost functions to mantid ones.
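As a minimal illustration of how this map is used, the sketch below reproduces the dictionary from above and looks up the string Mantid expects. The helper `mantid_cost_function` is hypothetical and not part of the controller's API:

```python
# COST_FUNCTION_MAP reproduced from the documentation above.
COST_FUNCTION_MAP = {
    "nlls": "Unweighted least squares",
    "poisson": "Poisson",
    "weighted_nlls": "Least squares",
}


def mantid_cost_function(name):
    """Return the cost-function string Mantid expects for a
    fitbenchmarking cost-function name (hypothetical helper)."""
    try:
        return COST_FUNCTION_MAP[name]
    except KeyError:
        raise ValueError(
            f"cost function '{name}' has no Mantid equivalent"
        ) from None


print(mantid_cost_function("weighted_nlls"))  # Least squares
```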

algorithm_check = {
    'all': ['BFGS', 'Conjugate gradient (Fletcher-Reeves imp.)',
            'Conjugate gradient (Polak-Ribiere imp.)', 'Damped GaussNewton',
            'Levenberg-Marquardt', 'Levenberg-MarquardtMD', 'Simplex',
            'SteepestDescent', 'Trust Region', 'FABADA'],
    'bfgs': ['BFGS'],
    'conjugate_gradient': ['Conjugate gradient (Fletcher-Reeves imp.)',
                           'Conjugate gradient (Polak-Ribiere imp.)'],
    'deriv_free': ['Simplex', 'FABADA'],
    'gauss_newton': ['Damped GaussNewton'],
    'general': ['BFGS', 'Conjugate gradient (Fletcher-Reeves imp.)',
                'Conjugate gradient (Polak-Ribiere imp.)',
                'Damped GaussNewton', 'Simplex', 'SteepestDescent'],
    'global_optimization': ['FABADA'],
    'levenberg-marquardt': ['Levenberg-Marquardt', 'Levenberg-MarquardtMD'],
    'ls': ['Levenberg-Marquardt', 'Levenberg-MarquardtMD', 'Trust Region',
           'FABADA'],
    'simplex': ['Simplex'],
    'steepest_descent': ['SteepestDescent'],
    'trust_region': ['Trust Region', 'Levenberg-Marquardt',
                     'Levenberg-MarquardtMD']}

Within the controller class, you must initialize a dictionary, algorithm_check, such that the keys are given by:

  • all - all minimizers

  • ls - least-squares fitting algorithms

  • deriv_free - derivative free algorithms (these are algorithms that cannot use information about derivatives – e.g., the Simplex method in Mantid)

  • general - minimizers which solve a generic min f(x)

  • simplex - derivative free simplex based algorithms e.g. Nelder-Mead

  • trust_region - algorithms which employ a trust region approach

  • levenberg-marquardt - minimizers that use the Levenberg-Marquardt algorithm

  • gauss_newton - minimizers that use the Gauss Newton algorithm

  • bfgs - minimizers that use the BFGS algorithm

  • conjugate_gradient - Conjugate Gradient algorithms

  • steepest_descent - Steepest Descent algorithms

  • global_optimization - Global Optimization algorithms

The values of the dictionary are given as a list of minimizers for that specific controller that fit into each of the above categories. See for example the GSL controller.
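The pattern can be sketched with a small subset of the Mantid dictionary shown above. The helper `minimizers_for` is hypothetical, illustrating how a benchmark runner might select minimizers by category:

```python
# Subset of the Mantid algorithm_check dictionary documented above.
algorithm_check = {
    "deriv_free": ["Simplex", "FABADA"],
    "simplex": ["Simplex"],
    "global_optimization": ["FABADA"],
    "steepest_descent": ["SteepestDescent"],
}


def minimizers_for(algorithm_type):
    """Return the minimizers registered under an algorithm category,
    or an empty list if the category is unknown (hypothetical helper)."""
    return algorithm_check.get(algorithm_type, [])


print(minimizers_for("deriv_free"))  # ['Simplex', 'FABADA']
```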


cleanup()

Convert the result to a numpy array and populate the variables results will be read from.

eval_chisq(params, x=None, y=None, e=None)

Computes the chisq value. If using multi-fit, the inputs will be lists, and this will return a list of chi-squared values, one for each set params[i], x[i], y[i], and e[i].

Parameters

  • params (list of float or list of list of float) – The parameters to calculate residuals for

  • x (numpy array or list of numpy arrays, optional) – x data points, defaults to self.data_x

  • y (numpy array or list of numpy arrays, optional) – y data points, defaults to self.data_y

  • e (numpy array or list of numpy arrays, optional) – error at each data point, defaults to self.data_e


Returns

The sum of squares of residuals for the data points at the given parameters

Return type

numpy array
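The returned quantity can be sketched as the (optionally error-weighted) sum of squared residuals. The function below is an illustration of that calculation, not the controller's implementation; `model` is a hypothetical callable standing in for the fitted function:

```python
import numpy as np


def eval_chisq_sketch(model, params, x, y, e=None):
    """Sum of squared residuals at the given parameters; residuals are
    divided point-wise by the errors when `e` is supplied."""
    residuals = y - model(x, *params)
    if e is not None:
        residuals = residuals / e
    return float(np.dot(residuals, residuals))


x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 5.0])
line = lambda xv, m, c: m * xv + c  # simple straight-line model

print(eval_chisq_sketch(line, [2.0, 1.0], x, y))  # 0.0 (exact fit)
print(eval_chisq_sketch(line, [2.0, 0.0], x, y))  # 3.0
```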


fit()

Run problem with Mantid.

jacobian_enabled_solvers = ['BFGS', 'Conjugate gradient (Fletcher-Reeves imp.)', 'Conjugate gradient (Polak-Ribiere imp.)', 'Damped GaussNewton', 'Levenberg-Marquardt', 'Levenberg-MarquardtMD', 'SteepestDescent', 'Trust Region']

Within the controller class, you must define the list jacobian_enabled_solvers if any of the minimizers for the specific software are able to use Jacobian information.

  • jacobian_enabled_solvers: a list of minimizers in a specific software that allow Jacobian information to be passed into the fitting algorithm
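A runner might consult this list to decide whether to supply derivative information. The sketch below uses a subset of the Mantid list from above; `wants_jacobian` is a hypothetical helper, not part of the controller:

```python
# Subset of jacobian_enabled_solvers documented above.
jacobian_enabled_solvers = ["BFGS", "Levenberg-Marquardt", "Trust Region"]


def wants_jacobian(minimizer):
    """True when the minimizer can accept a Jacobian (hypothetical helper)."""
    return minimizer in jacobian_enabled_solvers


print(wants_jacobian("Simplex"))  # False: Simplex is derivative-free
```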


setup()

Setup problem ready to run with Mantid.

Adds a custom function to Mantid for calling in fit().
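The "predefined format" mentioned at the top of this page is a class implementing Mantid's IFunction1D interface: parameters are declared in init() and the model is evaluated in function1D(). The sketch below imitates that pattern with a stub base class so it runs without Mantid installed; the stub, the BenchmarkFunction class, and its line model are all illustrative stand-ins, not the controller's actual code:

```python
import numpy as np


class IFunction1D:
    """Stand-in for mantid.api.IFunction1D so this sketch runs without
    Mantid; it mimics only the parameter-handling interface."""

    def __init__(self):
        self._params = {}

    def declareParameter(self, name, value=0.0):
        self._params[name] = value

    def getParameterValue(self, name):
        return self._params[name]


class BenchmarkFunction(IFunction1D):
    """Hypothetical fit function in the format Mantid expects."""

    def init(self):
        # Illustrative parameters for a straight line y = m*x + c.
        self.declareParameter("m", 1.0)
        self.declareParameter("c", 0.0)

    def function1D(self, xvals):
        m = self.getParameterValue("m")
        c = self.getParameterValue("c")
        return m * np.asarray(xvals) + c


# With Mantid available, the real controller would register the class
# (Mantid provides FunctionFactory.subscribe for this) before calling Fit.
f = BenchmarkFunction()
f.init()
print(f.function1D([0.0, 1.0, 2.0]))  # [0. 1. 2.]
```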