Active-set Method for Nonlinear Optimization

One of the key algorithms for general constrained nonlinear optimization problems is the active-set sequential quadratic programming (SQP) method. It is a workhorse of many commercial and open-source solvers. The NAG® Library offers a state-of-the-art version within the Optimization Modelling Suite. It complements NAG's interior point method (IPM) and other specialized methods for nonlinear optimization, such as Derivative-Free Optimization (DFO) and first-order methods.

Solver Documentation
Benefits
Outstanding Performance

Best algorithm for problems with many constraints

Robust

Reliable handling of hard or infeasible problems and problems with missing derivatives

Efficient for All Problem Sizes

Covers both dense (small and medium scale) as well as sparse (large scale) problems

Warm Starts

Efficiently solve a sequence of similar problems

Applications

Nonlinear programming (NLP) is commonplace across many industries and underpins numerous real-world applications: optimal control (for example, trajectory optimization, robotics and other processes described by differential equations), engineering, finance and manufacturing, among many others.

How the Active-Set Method Works

At each iteration, our active-set nonlinear solver solves a quadratic approximation of the original problem and estimates which constraints need to be kept (are active) and which can be ignored. A practical consequence is that the algorithm partly “walks along the boundary” of the feasible region defined by the constraints. The iterates therefore become feasible early on with respect to all linear constraints (and a local linearization of the nonlinear constraints), and this feasibility is preserved through the iterations. Complementarity is satisfied by construction, and once the active set has been identified correctly and optimality is within the tolerance, the solver finishes. The number of iterations might be high, but each one is relatively cheap.
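
To make the active-set idea concrete, the following minimal sketch uses SciPy's SLSQP routine (SciPy's own SQP-type method, not NAG's solver) on an invented two-variable problem with one nonlinear inequality constraint, and then checks which constraints hold with equality, i.e. are active, at the solution.

```python
# Illustration only: SciPy's SLSQP (an SQP-type method), not NAG's active-set solver.
import numpy as np
from scipy.optimize import minimize

# minimize (x0 - 1)^2 + (x1 - 2.5)^2
# subject to x0^2 + x1^2 <= 4  (disc constraint), x0 >= 0, x1 >= 0
obj = lambda x: (x[0] - 1.0)**2 + (x[1] - 2.5)**2
cons = [{'type': 'ineq', 'fun': lambda x: 4.0 - x[0]**2 - x[1]**2}]
bounds = [(0.0, None), (0.0, None)]

res = minimize(obj, x0=np.array([2.0, 0.0]), method='SLSQP',
               bounds=bounds, constraints=cons)
x = res.x
print('solution:', x)

# The unconstrained minimizer (1, 2.5) lies outside the disc, so the solution sits
# on the disc's boundary: the nonlinear constraint is active, the bounds are not.
print('disc constraint residual:', 4.0 - x[0]**2 - x[1]**2)  # ~0  => active
print('bound values:', x)                                      # > 0 => inactive
```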

Important Update of the Sparse SQP Solver at Mark 28.4

Performance and stability improvements – Benchmark studies on 826 NLP (CUTEst) problems show that the new handle_solve_ssqp (e04sr) is faster than the solver it replaces, nlp2_sparse_solve (e04vh), on at least 10% of the problems. They also show that e04sr is more robust than e04vh, solving 6.7% more problems. Specifically, e04sr is 25% faster on 13% of the problems and twice as fast on 5.7% of them. On the hardest 10% of the problems, e04sr is 1.25 times faster on 33% of those.

NAG Library Sparse SQP Solver

 

New intuitive interface – the NAG® Optimization Modelling Suite provides an easy-to-use interface for the solver, including extended modelling features with a rich API to define problems, and allows easy switching between different algorithms as and when required.
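
As a rough illustration of that define-the-model-once, switch-the-algorithm pattern, here is a small stand-in written in plain Python on top of SciPy. It is not the NAG API: the model dictionary, the solve helper and the solver names below are purely illustrative; consult the Solver Documentation for the actual interface.

```python
# Not the NAG API: a plain-Python stand-in (built on SciPy) that mimics the
# "define the model once, then switch algorithms" pattern the suite exposes.
import numpy as np
from scipy.optimize import minimize

# One model definition, kept independent of any particular solver.
model = {
    'objective':   lambda x: 100.0*(x[1] - x[0]**2)**2 + (1.0 - x[0])**2,   # Rosenbrock
    'x0':          np.array([-1.2, 1.0]),
    'bounds':      [(-2.0, 2.0), (-2.0, 2.0)],
    'constraints': [{'type': 'ineq', 'fun': lambda x: 1.5 - x[0] - x[1]}],  # x0 + x1 <= 1.5
}

def solve(model, method):
    # Only the algorithm name changes; the model definition stays untouched.
    return minimize(model['objective'], model['x0'], method=method,
                    bounds=model['bounds'], constraints=model['constraints'])

for method in ('SLSQP', 'trust-constr'):   # SQP-type vs interior-point-type
    print(method, solve(model, method).x)
```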