Cost functions
FitBenchmarking supports multiple cost functions. These can be set via the cost_func_type
option in Fitting Options.
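As a concrete illustration, the option is set in an ini-style FitBenchmarking options file. The snippet below is a sketch: the section name and the value `weighted_nlls` follow the usual options-file conventions, but consult the Fitting Options documentation for the exact names supported by your version.

```ini
[FITTING]
# Select the weighted non-linear least squares cost function
cost_func_type: weighted_nlls
```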
FitBenchmarking is designed to work with problems that have the form
\[\min_p F(r(x,y,p))\]
The function \(F(\cdot)\) is known as the cost function, while the function \(r(x,y,p)\) is known as the residual of the cost function. The residual will generally be zero if the fit is perfect. Both of these quantities together define a cost function in FitBenchmarking.
The cost functions that are currently supported are:
Non-linear least squares cost function
- class fitbenchmarking.cost_func.nlls_cost_func.NLLSCostFunc(problem)
This defines the non-linear least squares cost function where, given a set of \(n\) data points \((x_i,y_i)\), associated errors \(e_i\), and a model function \(f(x,p)\), we find the optimal parameters in the least-squares sense by solving:
\[\min_p \sum_{i=1}^n \left(y_i - f(x_i, p)\right)^2\]where \(p\) is a vector of length \(m\), and we start from a given initial guess for the optimal parameters. More information on non-linear least squares cost functions can be found here.
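As a sketch of the quantity being minimised (this is not the FitBenchmarking API; the exponential model below is made up for illustration), the cost can be evaluated directly with NumPy:

```python
import numpy as np

def model(x, p):
    # Hypothetical model f(x, p) = p[0] * exp(p[1] * x), for illustration only
    return p[0] * np.exp(p[1] * x)

def nlls_cost(p, x, y):
    # F(r) = sum_i (y_i - f(x_i, p))^2
    residuals = y - model(x, p)
    return np.sum(residuals ** 2)

x = np.array([0.0, 1.0, 2.0, 3.0])
p_true = np.array([2.0, 0.5])
y = model(x, p_true)

print(nlls_cost(p_true, x, y))  # 0.0: zero residual at the true parameters
```

A fitting software would search over \(p\) to drive this sum toward zero, starting from the initial guess.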
Weighted non-linear least squares cost function
- class fitbenchmarking.cost_func.weighted_nlls_cost_func.WeightedNLLSCostFunc(problem)
This defines the weighted non-linear least squares cost function where, given a set of \(n\) data points \((x_i,y_i)\), associated errors \(e_i\), and a model function \(f(x,p)\), we find the optimal parameters in the least-squares sense by solving:
\[\min_p \sum_{i=1}^n \left(\frac{y_i - f(x_i, p)}{e_i}\right)^2\]where \(p\) is a vector of length \(m\), and we start from a given initial guess for the optimal parameters. More information on non-linear least squares cost functions can be found here.
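To make the role of the weights concrete, here is a minimal NumPy sketch (again not the FitBenchmarking API; the linear model and data are invented): a data point with a large error bar contributes correspondingly less to the cost.

```python
import numpy as np

def model(x, p):
    # Hypothetical linear model f(x, p) = p[0] * x + p[1], for illustration only
    return p[0] * x + p[1]

def weighted_nlls_cost(p, x, y, e):
    # F(r) = sum_i ((y_i - f(x_i, p)) / e_i)^2
    residuals = (y - model(x, p)) / e
    return np.sum(residuals ** 2)

x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.5, 5.0])   # second point is off the line y = 2x + 1
p = np.array([2.0, 1.0])

uniform = np.array([1.0, 1.0, 1.0])
downweighted = np.array([1.0, 10.0, 1.0])  # large error bar on the outlier

print(weighted_nlls_cost(p, x, y, uniform))       # 0.25
print(weighted_nlls_cost(p, x, y, downweighted))  # 0.0025
```

Dividing each residual by \(e_i\) means noisy measurements pull the fit far less than precise ones.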
Hellinger non-linear least squares cost function
- class fitbenchmarking.cost_func.hellinger_nlls_cost_func.HellingerNLLSCostFunc(problem)
This defines the Hellinger non-linear least squares cost function where, given a set of \(n\) data points \((x_i,y_i)\), associated errors \(e_i\), and a model function \(f(x,p)\), we find the optimal parameters in the Hellinger least-squares sense by solving:
\[\min_p \sum_{i=1}^n \left(\sqrt{y_i} - \sqrt{f(x_i, p)}\right)^2\]where \(p\) is a vector of length \(m\), and we start from a given initial guess for the optimal parameters. More information on non-linear least squares cost functions can be found here and for the Hellinger distance measure see here.
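A minimal sketch of this cost (not the FitBenchmarking API; the model is hypothetical) shows that only the square roots of the data and model values enter the residual, which presupposes both are non-negative:

```python
import numpy as np

def model(x, p):
    # Hypothetical linear model, for illustration only
    return p[0] * x + p[1]

def hellinger_nlls_cost(p, x, y):
    # F(r) = sum_i (sqrt(y_i) - sqrt(f(x_i, p)))^2
    # requires y_i >= 0 and f(x_i, p) >= 0
    residuals = np.sqrt(y) - np.sqrt(model(x, p))
    return np.sum(residuals ** 2)

x = np.array([1.0, 2.0, 3.0])
p_true = np.array([2.0, 0.0])
y = model(x, p_true)                        # [2.0, 4.0, 6.0]
print(hellinger_nlls_cost(p_true, x, y))    # 0.0 at the true parameters
```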
Poisson deviance cost function
- class fitbenchmarking.cost_func.poisson_cost_func.PoissonCostFunc(problem)
This defines the Poisson deviance cost function where, given a set of \(n\) data points \((x_i, y_i)\) and a model function \(f(x,p)\), we find the optimal parameters in the Poisson deviance sense by solving:
\[\min_p \sum_{i=1}^n \left( y_i \left(\log{y_i} - \log{f(x_i, p)} \right) - \left( y_i - f(x_i, p) \right) \right)\]where \(p\) is a vector of length \(m\), and we start from a given initial guess for the optimal parameters.
This cost function is intended for strictly positive values of \(y_i\) and \(f(x_i, p)\), since the logarithms are otherwise undefined.
This cost function is not a least-squares problem and as such will not work with least-squares minimizers. Please use algorithm_type to select general solvers; see the options docs (Fitting Options) for information on how to do this.
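The deviance above can be sketched directly in NumPy (not the FitBenchmarking API; the count-rate model is hypothetical). Each term vanishes exactly when \(f(x_i, p) = y_i\), so a perfect fit gives a cost of zero, as with the least-squares variants:

```python
import numpy as np

def model(x, p):
    # Hypothetical count-rate model, for illustration only
    return p[0] * np.exp(p[1] * x)

def poisson_deviance_cost(p, x, y):
    # sum_i [ y_i (log y_i - log f(x_i, p)) - (y_i - f(x_i, p)) ]
    # assumes y_i > 0 and f(x_i, p) > 0
    f = model(x, p)
    return np.sum(y * (np.log(y) - np.log(f)) - (y - f))

x = np.array([0.0, 1.0, 2.0])
p_true = np.array([10.0, 0.3])
y = model(x, p_true)
print(poisson_deviance_cost(p_true, x, y))  # 0.0 when the model matches exactly
```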
Weighted non-linear least squares cost function with log-likelihood evaluation
- class fitbenchmarking.cost_func.loglike_nlls_cost_func.LoglikeNLLSCostFunc(problem)
This defines the weighted non-linear least squares cost function for Bayesian fitting where, given a set of \(n\) data points \((x_i,y_i)\), associated errors \(e_i\), and a model function \(f(x,p)\), we find the optimal parameters in the least-squares sense by solving:
\[\min_p \sum_{i=1}^n \left(\frac{y_i - f(x_i, p)}{e_i}\right)^2\]where \(p\) is a vector of length \(m\), and we start from a given initial guess for the optimal parameters. More information on non-linear least squares cost functions can be found here.
This cost function is intended for use with Bayesian fitting methods, where evaluation of the log-likelihood is required.
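For a Gaussian error model, minimising the weighted sum of squares is equivalent to maximising the log-likelihood up to an additive constant that does not depend on \(p\). The sketch below (not the FitBenchmarking API; model and data are invented) makes that relationship explicit:

```python
import numpy as np

def model(x, p):
    # Hypothetical linear model, for illustration only
    return p[0] * x + p[1]

def weighted_nlls_cost(p, x, y, e):
    residuals = (y - model(x, p)) / e
    return np.sum(residuals ** 2)

def gaussian_loglike(p, x, y, e):
    # log L = -0.5 * chi^2 + const; the constant is independent of p,
    # so maximising log L minimises the weighted least-squares cost
    return -0.5 * weighted_nlls_cost(p, x, y, e)

x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 5.2])
e = np.array([0.1, 0.1, 0.2])
p = np.array([2.0, 1.0])
print(gaussian_loglike(p, x, y, e))  # -0.5
```

Bayesian samplers typically consume this log-likelihood value directly rather than the squared cost.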