Derivative Free
Derivative free methods do not compute the gradient of the objective function, so they are often used to minimize problems with nondifferentiable functions. Some derivative free methods approximate the gradient using finite differences; another class constructs a linear or quadratic model of the objective function and uses a trust region approach to find the next iterate. Another widely used derivative free method is the Nelder-Mead simplex method. [Nocedal]
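As a minimal sketch of the finite difference idea mentioned above (the helper name `fd_gradient` and the step size are illustrative, not part of any particular library):

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference approximation to the gradient of f at x:
    each component g_i is approximated by (f(x + h*e_i) - f(x)) / h."""
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = h  # perturb only the i-th coordinate
        g[i] = (f(x + step) - fx) / h
    return g

# Example: gradient of f(x) = x0**2 + 3*x1 at (1, 2) is (2, 3)
g = fd_gradient(lambda x: x[0]**2 + 3.0 * x[1], [1.0, 2.0])
```

This costs \(n+1\) function evaluations per gradient, which is why such approximations are only attractive when evaluations are cheap.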
To select all minimizers in fitbenchmarking that use derivative free methods, use the algorithm type deriv_free.
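Assuming the standard FitBenchmarking INI-style options file (section and key names should be checked against the options documentation), this selection might look like:

```ini
[FITTING]
algorithm_type: deriv_free
```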
Simplex (simplex)
Nelder-Mead is a simplex based algorithm, where a simplex \(S\) in \(\mathbb{R}^n\) is defined as the convex hull of \(n+1\) vertices \(\{x_1, ..., x_{n+1}\}\).
In an iteration of the algorithm, the idea is to remove the vertex with the worst function value and replace it with a point with a better value. An iteration consists of the following steps:
1. Order the vertices of \(S\) so that \(f(x_1) \leq f(x_2) \leq ... \leq f(x_{n+1})\)
2. Calculate the centroid \(\bar{x}\) of the best \(n\) points: \(\bar{x} = \frac{1}{n} \sum_{i=1}^n x_i\)
3. Carry out a transformation to compute the new simplex. Try to replace only the worst vertex \(x_{n+1}\) with a better point, which then becomes the new vertex. If this fails, the simplex is shrunk towards the best vertex \(x_1\) and \(n\) new vertices are computed. The algorithm terminates when the simplex \(S\) is sufficiently small. [Singer]
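The iteration above can be sketched in a simplified form that uses only the reflection and shrink transformations (full Nelder-Mead also tries expansion and contraction steps; the function names and parameter values here are illustrative):

```python
import numpy as np

def nelder_mead_step(f, simplex, alpha=1.0, sigma=0.5):
    """One simplified Nelder-Mead iteration: reflect the worst vertex
    through the centroid of the rest; if that fails, shrink towards
    the best vertex."""
    # Order vertices so that f(x_1) <= f(x_2) <= ... <= f(x_{n+1})
    simplex = simplex[np.argsort([f(x) for x in simplex])]
    # Centroid of the best n points
    centroid = simplex[:-1].mean(axis=0)
    # Reflect the worst vertex x_{n+1} through the centroid
    reflected = centroid + alpha * (centroid - simplex[-1])
    if f(reflected) < f(simplex[-1]):
        simplex[-1] = reflected  # accept the better point
    else:
        # Shrink all other vertices towards the best vertex x_1
        simplex[1:] = simplex[0] + sigma * (simplex[1:] - simplex[0])
    return simplex

def nelder_mead(f, simplex, tol=1e-8, max_iter=500):
    """Iterate until the simplex is sufficiently small."""
    simplex = np.asarray(simplex, dtype=float)
    for _ in range(max_iter):
        simplex = nelder_mead_step(f, simplex)
        if np.max(np.abs(simplex - simplex[0])) < tol:
            break
    return simplex[0]

# Example: minimize a convex quadratic with minimum at (3, -1)
quad = lambda x: (x[0] - 3.0)**2 + (x[1] + 1.0)**2
x_min = nelder_mead(quad, [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
```

Note how the driver loop only checks the simplex diameter, matching the termination criterion described above.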
- Advantages:
The method only requires one or two function evaluations per iteration.
Gives significant improvement in the first few iterations - quick to produce satisfactory results.
- Disadvantages:
Stagnation can occur at non-optimal points, with large numbers of iterations leading to negligible improvement even when nowhere near a minimum.
If the numerical computation of the function's derivative can be trusted, then other algorithms are more robust.