Calibrating the parameters of complex numerical models to fit real-world observations is one of the most common problems found in industries such as finance, physics, simulation and engineering. At Mark 27.1 of the NAG Library, NAG introduces a novel nonlinear least squares (NLN-LSQ) trust-region solver, handle_solve_bxnl (e04gg), for unconstrained and bound-constrained fitting problems, implementing a variety of algorithms and regularization techniques. It is aimed at small to medium sized fitting problems (up to thousands of parameters) of the form

\[ \min_{x \in \mathbb{R}^n} \; f(x) = \tfrac{1}{2} \sum_{i=1}^{m} r_i(x)^2 \quad \text{subject to} \quad l \le x \le u, \]

where the idea is to find optimal values for the parameters, x, of the model represented by the smooth function (blue line) which fits the observed data points (black dots). That is, each residual r_i(x) measures the misfit between the model and the i-th observation, and the solver minimizes the squared error between the model and the data (vertical red bars).
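To make the formulation concrete, here is a minimal sketch of the same class of problem solved with SciPy's trust-region least-squares solver (not NAG's API; the model, data and starting point are invented for illustration):

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic observations from y = 2.5 * exp(-1.3 * t) plus noise (illustrative).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 2.0, 25)
y = 2.5 * np.exp(-1.3 * t) + 0.02 * rng.standard_normal(t.size)

def residuals(x):
    # r_i(x) = model(t_i; x) - y_i; the solver minimizes (1/2) * sum(r_i(x)**2).
    return x[0] * np.exp(x[1] * t) - y

# Bound-constrained trust-region solve from a rough starting point.
sol = least_squares(residuals, x0=[1.0, -1.0],
                    bounds=([0.0, -10.0], [10.0, 0.0]), method='trf')
print(sol.x)  # close to the true parameters [2.5, -1.3]
```

The residual callback is all the solver needs; derivatives are approximated by finite differences here, whereas e04gg can also accept exact gradients from the user.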
e04gg offers a significant improvement over the existing nonlinear least squares solvers in the NAG Library. In addition, it fills the gap between unconstrained solvers, such as lsq_uncon_quasi_deriv_comp (e04gb), and fully constrained ones, such as lsq_gencon_deriv (e04us).
The new solver stems from a collaboration with the Rutherford Appleton Laboratory and demonstrates NAG’s ongoing effort to expand and improve its offering in mathematical optimization. Its key features include:
- Well-established trust-region method with a variety of implemented algorithms, ranging from the simple Powell dogleg and Gauss–Newton methods to sophisticated Tensor–Newton schemes that overcome convergence difficulties present in simpler methods;
- Avoids over-fitting the data by incorporating different types of regularization for both the problem formulation and the trust-region subproblem;
- Recovers when it is not possible to evaluate the function or its gradient at a given point;
- Accounts for uncertainty in the observed data through optional residual weights;
- Flexible stopping criteria that can be tuned to the problem and data tolerances.
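Residual weights deserve a short illustration. A common convention (assumed here; check the e04gg documentation for the exact weighting scheme it uses) is to weight each residual by the inverse of the measurement standard deviation, so uncertain observations pull less on the fit. Sketched with SciPy and invented data:

```python
import numpy as np
from scipy.optimize import least_squares

# Observations of y = x0 * exp(x1 * t); the last two points are much noisier.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
y = np.array([1.02, 1.24, 1.49, 1.80, 2.21, 2.90, 4.10])      # illustrative
sigma = np.array([0.02, 0.02, 0.02, 0.02, 0.02, 0.30, 0.90])  # per-point uncertainty

def weighted_residuals(x):
    # Dividing r_i by sigma_i minimizes sum((r_i / sigma_i)**2), i.e. applies
    # weights w_i = 1 / sigma_i**2 that downweight the unreliable observations.
    return (x[0] * np.exp(x[1] * t) - y) / sigma

sol = least_squares(weighted_residuals, x0=[1.0, 0.1], method='trf')
print(sol.x)  # close to [1.0, 0.4] despite the two outlying points
```

Without the weighting, the two noisy points would drag the fitted growth rate well above its true value.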
The new solver e04gg offers unprecedented robustness and a significant speed-up over the current alternatives in the Library, namely e04gb for unconstrained nonlinear least squares problems and e04us for problems with simple variable bounds. We highly recommend upgrading to the new solver.
Our benchmarks comparing e04gg to e04gb on 68 unconstrained nonlinear least-squares CUTEst problems are reported in Figure 1 (a)–(c) using performance profiles. The three plots show that the new solver is faster, solving 60% of the problems in less time (a), and more robust, solving 25% more problems. It also requires fewer user callbacks: 55% of the problems need fewer function evaluations (b) and 65% need fewer gradient evaluations (c).
As the e04gb solver was not designed to handle simple bounds directly, e04us is typically used instead for such problems. However, being a more general solver, e04us does not exploit the structure of NLN-LSQ problems in the presence of simple bounds as fully as e04gg does. Comparing e04us and e04gg on 112 unconstrained and bound-constrained nonlinear least-squares CUTEst problems shows a speed-up on 45% of the problems (d), as well as a reduction in user callbacks: 65% of the problems require fewer function and gradient calls (e and f).
Real-world example: particle track data fitting
The following example illustrates the use of e04gg to fit PADC etched nuclear track data to a mixed Normal and log-Normal distribution. A target sheet is scanned and the track diameters are recorded (red wedges in the left image of Figure 2) into a histogram (blue bars in the right plot of Figure 2), and the mixed model is fitted to the obtained experimental histogram. e04gg is used to fit the six-parameter model

\[ h(d) = A_1 \exp\!\left(-\frac{(d-\mu_1)^2}{2\sigma_1^2}\right) + \frac{A_2}{d} \exp\!\left(-\frac{(\ln d-\mu_2)^2}{2\sigma_2^2}\right) \]
using the histogram heights. Thanks to the use of regularization and residual weights, e04gg provided a robust solution, unfolding the parameters of the two distributions (red and blue curves in the right plot of Figure 2). Adding these together produces the green curve, which is the one used to perform the fitting. Data and the complete Python source code for this example are available here.
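A self-contained sketch of this kind of six-parameter fit is shown below. It uses SciPy's bound-constrained trust-region solver rather than e04gg, and the bin centres, true parameters and starting point are all invented for illustration; the Normal-plus-log-Normal form is one plausible parameterization of the model described above:

```python
import numpy as np
from scipy.optimize import least_squares

def model(d, p):
    # Sum of a Normal and a log-Normal component; p = (a1, m1, s1, a2, m2, s2).
    a1, m1, s1, a2, m2, s2 = p
    normal = a1 * np.exp(-(d - m1) ** 2 / (2.0 * s1 ** 2))
    lognorm = a2 / d * np.exp(-(np.log(d) - m2) ** 2 / (2.0 * s2 ** 2))
    return normal + lognorm

# Synthetic "histogram" heights from known parameters plus small noise.
rng = np.random.default_rng(1)
d = np.linspace(0.5, 10.0, 60)           # bin centres (illustrative)
p_true = (1.0, 6.0, 1.0, 2.0, 0.8, 0.4)
h = model(d, p_true) + 0.01 * rng.standard_normal(d.size)

# Bound-constrained trust-region fit: amplitudes and widths kept positive.
lb = [0.0, 0.0, 1e-3, 0.0, -5.0, 1e-3]
ub = [10.0, 10.0, 5.0, 10.0, 5.0, 5.0]
sol = least_squares(lambda p: model(d, p) - h,
                    x0=[0.5, 5.0, 1.5, 1.0, 0.5, 0.5],
                    bounds=(lb, ub), method='trf')
print(sol.x)  # recovers parameters close to p_true
```

Once the parameters are recovered, evaluating each component of `model` separately reproduces the two individual curves, and their sum gives the overall fitted distribution.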
For further examples and reading, visit our GitHub Local Optimization page.
References

Gould, N. I. M., Rees, T., and Scott, J. A. (2017) A higher order method for solving nonlinear least-squares problems. Technical Report RAL-P-2017-010, STFC Rutherford Appleton Laboratory. Link to report.

Kanzow, C., Yamashita, N., and Fukushima, M. (2004) Levenberg–Marquardt methods with strong local convergence properties for solving nonlinear equations with convex constraints. Journal of Computational and Applied Mathematics 174, 375–397.