fitbenchmarking.controllers.mantid_controller module
Implements a controller for the Mantid fitting software.
- class fitbenchmarking.controllers.mantid_controller.MantidController(cost_func)
Bases:
Controller
Controller for the Mantid fitting software.
Mantid requires subscribing a custom function in a predefined format, so this controller creates that in setup.
- COST_FUNCTION_MAP = {'nlls': 'Unweighted least squares', 'poisson': 'Poisson', 'weighted_nlls': 'Least squares'}
A map from fitbenchmarking cost functions to mantid ones.
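The mapping above can be used as a simple dictionary lookup. A minimal sketch, using the documented COST_FUNCTION_MAP values; the helper function itself is hypothetical, not part of the controller's API:

```python
# Values taken from the documented COST_FUNCTION_MAP; the lookup helper
# below is a hypothetical illustration, not the controller's actual code.
COST_FUNCTION_MAP = {
    "nlls": "Unweighted least squares",
    "poisson": "Poisson",
    "weighted_nlls": "Least squares",
}

def mantid_cost_function(fitbenchmarking_name):
    """Translate a fitbenchmarking cost function name to Mantid's name."""
    try:
        return COST_FUNCTION_MAP[fitbenchmarking_name]
    except KeyError:
        raise ValueError(f"Unsupported cost function: {fitbenchmarking_name}")

result = mantid_cost_function("weighted_nlls")  # "Least squares"
```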
- algorithm_check = {'MCMC': [], 'all': ['BFGS', 'Conjugate gradient (Fletcher-Reeves imp.)', 'Conjugate gradient (Polak-Ribiere imp.)', 'Damped GaussNewton', 'Levenberg-Marquardt', 'Levenberg-MarquardtMD', 'Simplex', 'SteepestDescent', 'Trust Region', 'FABADA'], 'bfgs': ['BFGS'], 'conjugate_gradient': ['Conjugate gradient (Fletcher-Reeves imp.)', 'Conjugate gradient (Polak-Ribiere imp.)'], 'deriv_free': ['Simplex', 'FABADA'], 'gauss_newton': ['Damped GaussNewton'], 'general': ['BFGS', 'Conjugate gradient (Fletcher-Reeves imp.)', 'Conjugate gradient (Polak-Ribiere imp.)', 'Damped GaussNewton', 'Simplex', 'SteepestDescent'], 'global_optimization': ['FABADA'], 'levenberg-marquardt': ['Levenberg-Marquardt', 'Levenberg-MarquardtMD'], 'ls': ['Levenberg-Marquardt', 'Levenberg-MarquardtMD', 'Trust Region', 'FABADA'], 'simplex': ['Simplex'], 'steepest_descent': ['SteepestDescent'], 'trust_region': ['Trust Region', 'Levenberg-Marquardt', 'Levenberg-MarquardtMD']}
Within the controller class, you must initialize a dictionary, algorithm_check, such that the keys are given by:
- all - all minimizers
- ls - least-squares fitting algorithms
- deriv_free - derivative free algorithms (these are algorithms that cannot use information about derivatives, e.g., the Simplex method in Mantid)
- general - minimizers which solve a generic min f(x)
- simplex - derivative free simplex based algorithms, e.g. Nelder-Mead
- trust_region - algorithms which employ a trust region approach
- levenberg-marquardt - minimizers that use the Levenberg-Marquardt algorithm
- gauss_newton - minimizers that use the Gauss-Newton algorithm
- bfgs - minimizers that use the BFGS algorithm
- conjugate_gradient - Conjugate Gradient algorithms
- steepest_descent - Steepest Descent algorithms
- global_optimization - Global Optimization algorithms
- MCMC - Markov Chain Monte Carlo algorithms
The values of the dictionary are lists of the minimizers for that specific controller that fit into each of the above categories. See, for example, the GSL controller.
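The category lists above can be queried directly from the dictionary. A minimal sketch, using an abbreviated subset of the documented algorithm_check entries; the helper function is a hypothetical illustration:

```python
# Abbreviated subset of the documented algorithm_check dictionary;
# minimizers_for is a hypothetical helper, not part of the controller.
algorithm_check = {
    "all": ["BFGS", "Simplex", "Trust Region", "FABADA"],
    "deriv_free": ["Simplex", "FABADA"],
    "trust_region": ["Trust Region"],
    "MCMC": [],  # Mantid provides no MCMC minimizers in this category
}

def minimizers_for(category):
    """Return the minimizers tagged with a category (empty list if none)."""
    return algorithm_check.get(category, [])

deriv_free = minimizers_for("deriv_free")  # ["Simplex", "FABADA"]
mcmc = minimizers_for("MCMC")              # []
```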
- cleanup()
Convert the result to a numpy array and populate the variables that the results will be read from.
- eval_chisq(params, x=None, y=None, e=None)
Computes the chisq value. For multi-fit problems the inputs will be lists, and this will return a list of chi-squared values, one for each set params[i], x[i], y[i], and e[i].
- Parameters:
params (list of float or list of list of float) – The parameters to calculate residuals for
x (numpy array or list of numpy arrays, optional) – x data points, defaults to self.data_x
y (numpy array or list of numpy arrays, optional) – y data points, defaults to self.data_y
e (numpy array or list of numpy arrays, optional) – error at each data point, defaults to self.data_e
- Returns:
The sum of squares of residuals for the datapoints at the given parameters
- Return type:
numpy array
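The computation eval_chisq describes is a (weighted) sum of squared residuals. A minimal sketch for a single dataset, assuming a hypothetical model callable in place of the controller's fitted function; this is not the controller's actual implementation:

```python
import numpy as np

# Minimal sketch of a weighted sum-of-squares computation. `model` is a
# hypothetical stand-in for evaluating the fitted function at `params`.
def chisq(model, params, x, y, e=None):
    residuals = y - model(x, *params)
    if e is not None:
        residuals = residuals / e  # weight each residual by its error
    return np.sum(residuals ** 2)

def line(x, m, c):
    return m * x + c

x = np.array([0.0, 1.0, 2.0])
y = np.array([0.1, 1.0, 2.2])
total = chisq(line, [1.0, 0.0], x, y)  # residuals 0.1, 0.0, 0.2 -> ~0.05
```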
- fit()
Run problem with Mantid.
- jacobian_enabled_solvers = ['BFGS', 'Conjugate gradient (Fletcher-Reeves imp.)', 'Conjugate gradient (Polak-Ribiere imp.)', 'Damped GaussNewton', 'Levenberg-Marquardt', 'Levenberg-MarquardtMD', 'SteepestDescent', 'Trust Region']
Within the controller class, you must define the list jacobian_enabled_solvers if any of the minimizers for the specific software are able to use Jacobian information.
jacobian_enabled_solvers: a list of minimizers in a specific software that allow Jacobian information to be passed into the fitting algorithm.
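Deciding whether to pass Jacobian information then reduces to a membership test on this list. A minimal sketch, using an abbreviated subset of the documented jacobian_enabled_solvers; the helper function is a hypothetical illustration:

```python
# Abbreviated subset of the documented jacobian_enabled_solvers list;
# uses_jacobian is a hypothetical helper, not part of the controller.
jacobian_enabled_solvers = [
    "BFGS",
    "Levenberg-Marquardt",
    "Trust Region",
]

def uses_jacobian(minimizer):
    """True if the minimizer can accept Jacobian information."""
    return minimizer in jacobian_enabled_solvers

trust_region_ok = uses_jacobian("Trust Region")  # True
simplex_ok = uses_jacobian("Simplex")            # False (derivative free)
```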
- setup()
Setup problem ready to run with Mantid.
Adds a custom function to Mantid for calling in fit().