NAG AD Library e04kf_a1w_f (handle_solve_bounds_foas_a1w)
Note: a1w denotes that first order adjoints are computed in working precision; this has the corresponding argument type nagad_a1w_w_rtype.
Also available is the t1w (first order tangent linear) mode, the interface of which is implied by replacing a1w by t1w throughout this document.
Additionally, the p0w (passive interface, an alternative to the FL interface) mode is available; its interface can be inferred by replacing active types with the corresponding passive types.
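The distinction between the tangent-linear (t1w) and adjoint (a1w) modes can be made concrete with a hand-coded sketch. This is plain Python illustrating the two derivative sweeps conceptually, not the NAG interface: the tangent sweep propagates one directional derivative forwards alongside the primal values, while the adjoint sweep records the primal computation and then accumulates the whole gradient in a single reverse pass.

```python
import math

# Example function f(x0, x1) = x0*x1 + sin(x0);
# analytic gradient: (x1 + cos(x0), x0).

def f_tangent(x0, x1, d0, d1):
    """Tangent-linear (t1w-style) sweep: propagate one directional
    derivative (d0, d1) forwards with the primal computation."""
    v = x0 * x1 + math.sin(x0)
    dv = d0 * x1 + x0 * d1 + math.cos(x0) * d0
    return v, dv

def f_adjoint(x0, x1):
    """Adjoint (a1w-style) sweep: a forward pass records the primal
    intermediates, then a reverse pass accumulates the full gradient."""
    # forward (primal) pass
    t = x0 * x1
    s = math.sin(x0)
    v = t + s
    # reverse pass, seeding the output adjoint with 1.0
    v_bar = 1.0
    t_bar = v_bar
    s_bar = v_bar
    x0_bar = t_bar * x1 + s_bar * math.cos(x0)
    x1_bar = t_bar * x0
    return v, (x0_bar, x1_bar)

x0, x1 = 0.5, 2.0
_, df_dx0 = f_tangent(x0, x1, 1.0, 0.0)   # one directional derivative
_, grad = f_adjoint(x0, x1)               # whole gradient in one sweep
```

One tangent sweep yields a single directional derivative, whereas one adjoint sweep yields the entire gradient; this is why adjoint mode is the natural choice when the gradient of a scalar objective with many variables is required.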
The method of codifying AD implementations in the routine name and corresponding argument types is described in the NAG AD Library Introduction.
The routine may be called by the names e04kf_a1w_f or nagf_opt_handle_solve_bounds_foas_a1w. The corresponding t1w and p0w variants of this routine are also available.
e04kf_a1w_f is the adjoint version of the primal routine e04kff.
e04kff is a solver from the NAG optimization modelling suite for bound-constrained large-scale Nonlinear Programming (NLP) problems. It is a first-order active-set method (FOAS) that has low memory requirements and is thus suitable for very large-scale problems.
For further information see Section 3 in the documentation for e04kff.
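To illustrate the class of problems the solver addresses, the following sketch minimizes a simple quadratic subject to bound constraints using projected gradient descent. This is a deliberately simplified stand-in, not the FOAS algorithm itself: it shares only the first-order, gradient-only, low-memory character, and all names and data here are hypothetical.

```python
# minimize f(x) = 0.5 * ||x - c||^2  subject to  l <= x <= u,
# via projected gradient descent: take a gradient step, then clip
# the iterate back onto the box defined by the bounds.
c = [3.0, -2.0, 0.5]   # unconstrained minimizer
l = [0.0, 0.0, 0.0]    # lower bounds
u = [1.0, 1.0, 1.0]    # upper bounds

def grad(x):
    # gradient of 0.5 * ||x - c||^2
    return [xi - ci for xi, ci in zip(x, c)]

def project(x):
    # componentwise projection onto the box [l, u]
    return [min(max(xi, li), ui) for xi, li, ui in zip(x, l, u)]

x = [0.5, 0.5, 0.5]
step = 0.5
for _ in range(100):
    g = grad(x)
    x = project([xi - step * gi for xi, gi in zip(x, g)])

# the solution clips c onto the box: x* = (1.0, 0.0, 0.5)
```

Note how the bounds that are active at the solution (the first two components, which sit on a bound) play the role of the active set that FOAS identifies and exploits.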
Dai Y-H and Kou C-X (2013) A Nonlinear Conjugate Gradient Algorithm with an Optimal Property and an Improved Wolfe Line Search SIAM J. Optim. 23(1) 296–320
Gill P E and Leonard M W (2003) Limited-Memory Reduced-Hessian Methods for Large-Scale Unconstrained Optimization SIAM J. Optim. 14(2) 380–401
Hager W W and Zhang H (2005) A New Conjugate Gradient Method with Guaranteed Descent and an Efficient Line Search SIAM J. Optim. 16(1) 170–192
Hager W W and Zhang H (2006a) Algorithm 851: CG DESCENT, a Conjugate Gradient Method with Guaranteed Descent ACM Trans. Math. Software 32(1) 113–137
Hager W W and Zhang H (2006b) A New Active Set Algorithm for Box Constrained Optimization SIAM J. Optim. 17(2) 525–557
Hager W W and Zhang H (2013) The Limited Memory Conjugate Gradient Method SIAM J. Optim. 23(4) 2150–2168
Nocedal J and Wright S J (2006) Numerical Optimization (2nd Edition) Springer Series in Operations Research, Springer, New York
In addition to the arguments present in the interface of the primal routine, e04kf_a1w_f includes some arguments specific to AD. A brief summary of the AD-specific arguments is given below; for the remainder, links are provided to the corresponding argument of the primal routine.