# library.glopt Submodule

## Module Summary

Interfaces for the NAG Mark 27.1 glopt Chapter.

glopt - Global Optimization of a Function

Global optimization involves finding the absolute maximum or minimum value of a function (the objective function) of several variables, possibly subject to restrictions (defined by a set of bounds or constraint functions) on the values of the variables. Such problems can be much harder to solve than local optimization problems (which are discussed in submodule opt) because it is difficult to determine whether a potential optimum found is global, and because of the nonlocal methods required to avoid becoming trapped near local optima. Most optimization functions in the NAG Library are concerned with function minimization only, since the problem of maximizing a given objective function F(x) is equivalent to minimizing -F(x). In bnd_mcs_solve(), bnd_pso() and nlp_pso(), you may specify whether you are solving a minimization or maximization problem; in the latter case, the required transformation of the objective function will be carried out automatically. In what follows we refer exclusively to minimization problems.
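The maximization-to-minimization transformation mentioned above can be sketched in plain Python. This is an illustrative example only, not the NAG API: `negated` and `crude_grid_minimize` are made-up helper names, and the deliberately naive 1-D grid search stands in for a real solver such as bnd_pso().

```python
# Sketch: maximizing F(x) is equivalent to minimizing -F(x).  This is
# the transformation that bnd_mcs_solve(), bnd_pso() and nlp_pso()
# apply internally when you request a maximization problem.

def negated(objfun):
    """Wrap an objective so that a minimizer can be used to maximize it."""
    return lambda x: -objfun(x)

def crude_grid_minimize(objfun, lo, hi, steps=1000):
    """A deliberately naive 1-D grid search, for illustration only."""
    best_x, best_f = lo, objfun(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps
        f = objfun(x)
        if f < best_f:
            best_x, best_f = x, f
    return best_x, best_f

# Maximize F(x) = 4 - (x - 2)**2 on [0, 5] by minimizing -F(x):
F = lambda x: 4.0 - (x - 2.0) ** 2
x_star, neg_f_star = crude_grid_minimize(negated(F), 0.0, 5.0)
f_max = -neg_f_star  # undo the sign flip to report the maximum of F
print(x_star, f_max)  # -> 2.0 4.0
```

The wrapper is all a maximizing caller needs; the minimizer itself is unchanged, which is why the Library can offer both modes through a single option.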

This introduction is a brief guide to the subject of global optimization, designed for the casual user. For further details you may find it beneficial to consult a more detailed text, such as Neumaier (2004). Furthermore, much of the material in the E04 Introduction is also relevant in this context, and you are strongly recommended to read it.

naginterfaces.library.examples.glopt:

This subpackage contains examples for the glopt module. See also the Examples subsection.

## Functionality Index

- Nonlinear programming (NLP) – global optimization
  - bound constrained
    - branching algorithm, multi-level coordinate search: bnd_mcs_solve()
    - heuristic algorithm, particle swarm optimization (PSO): bnd_pso()
  - generic, including nonlinearly constrained
    - heuristic algorithm, particle swarm optimization (PSO): nlp_pso()
    - multi-start: nlp_multistart_sqp()
- Nonlinear least squares, data fitting – global optimization
  - generic, including nonlinearly constrained
    - multi-start: nlp_multistart_sqp_lsq()
- Service functions
  - option setting functions
    - bnd_mcs_solve()
      - initialization: bnd_mcs_init()
      - check whether an option has been set: bnd_mcs_option_check()
      - retrieve character option values: bnd_mcs_optget_char()
      - retrieve integer option values: bnd_mcs_optget_int()
      - retrieve real option values: bnd_mcs_optget_real()
      - supply character option values: bnd_mcs_optset_char()
      - supply integer option values: bnd_mcs_optset_int()
      - supply option values from a character string: bnd_mcs_optset_string()
      - supply option values from an external file: bnd_mcs_optset_file()
      - supply real option values: bnd_mcs_optset_real()
    - supply option values from a character string: optset()
    - retrieve option values: optget()
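The bnd_mcs_* service functions above follow an initialize/set/check/get life cycle. The sketch below models that pattern with a toy `Options` class whose methods are illustrative stand-ins, not the naginterfaces API; in the real library, bnd_mcs_init() creates a communication structure that the bnd_mcs_optset_*/bnd_mcs_optget_* functions then update and query.

```python
# Hypothetical model of the option-setting life cycle.  Option names
# and defaults here are invented for illustration.

class Options:
    def __init__(self):                      # cf. bnd_mcs_init()
        self._opts = {'Function Evaluations Limit': 10000}
        self._set = set()

    def optset_string(self, option_string):  # cf. bnd_mcs_optset_string()
        name, _, value = option_string.partition('=')
        self._opts[name.strip()] = int(value)
        self._set.add(name.strip())

    def option_check(self, name):            # cf. bnd_mcs_option_check()
        return name in self._set

    def optget_int(self, name):              # cf. bnd_mcs_optget_int()
        return self._opts[name]

opts = Options()                                        # initialize
opts.optset_string('Function Evaluations Limit = 500')  # supply a value
print(opts.option_check('Function Evaluations Limit'),  # -> True
      opts.optget_int('Function Evaluations Limit'))    # -> 500
```

Keeping the options in a structure created by an init call is what lets several optset/optget calls cooperate across one solve without global state.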

For full information please refer to the NAG Library document:

https://www.nag.com/numeric/nl/nagdoc_27.1/flhtml/e05/e05intro.html

## Examples

naginterfaces.library.examples.glopt.bnd_mcs_solve_ex.main()

Global optimization by multi-level coordinate search.

>>> main()
naginterfaces.library.glopt.bnd_mcs_solve Python Example Results.
Global optimization of the Peaks objective function.
Final objective value is -6.55113
Global optimum is (0.22828, -1.62553)
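The "Peaks" objective in this example can be checked independently. The definition below is the well-known two-variable MATLAB peaks function, which I am assuming is what the example minimizes; evaluating it at the optimum reported above reproduces the printed objective value to within about 1e-4.

```python
from math import exp

def peaks(x, y):
    """The standard MATLAB 'peaks' test function (assumed definition)."""
    return (3.0 * (1.0 - x) ** 2 * exp(-x ** 2 - (y + 1.0) ** 2)
            - 10.0 * (x / 5.0 - x ** 3 - y ** 5) * exp(-x ** 2 - y ** 2)
            - exp(-(x + 1.0) ** 2 - y ** 2) / 3.0)

# Evaluate at the global optimum reported by bnd_mcs_solve() above:
value = peaks(0.22828, -1.62553)
print(abs(value - (-6.55113)) < 1e-4)  # -> True
```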

naginterfaces.library.examples.glopt.nlp_multistart_sqp_lsq_ex.main()

Global optimization of a sum of squares problem using multi-start.

Demonstrates catching a NagAlgorithmicWarning and accessing its return_data attribute.

>>> main()
naginterfaces.library.glopt.nlp_multistart_sqp_lsq Python Example Results.
Minimizes the sum of squares function
based on Problem 57 in Hock and Schittkowski (1981).
Solution number 1.
Final objective value =       0.0142298.
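The catch-and-recover pattern this example demonstrates can be sketched without the NAG Library installed. Here `NagAlgorithmicWarning` is a stand-in class (the real one lives in naginterfaces), `solve` is a made-up placeholder for the solver call, and the numbers in `return_data` are invented; only the try/except shape and the `return_data` attribute mirror the real example.

```python
# Illustrative pattern only: how a solver can hand partial results back
# through a warning's return_data attribute, as the real example does
# for nlp_multistart_sqp_lsq.

class NagAlgorithmicWarning(Exception):
    """Stand-in for the NAG warning that carries partial results."""
    def __init__(self, msg, return_data):
        super().__init__(msg)
        self.return_data = return_data

def solve():
    # Pretend the solver finished with a nonzero exit status but still
    # has a usable (hypothetical) solution to report.
    raise NagAlgorithmicWarning(
        'Exit from one of the local solves.',
        {'x': [0.1, 0.2], 'objf': 0.0142298},
    )

try:
    result = solve()
except NagAlgorithmicWarning as exc:
    result = exc.return_data  # recover the partial results
print(result['objf'])  # -> 0.0142298
```

Raising a catchable warning that still carries the computed data lets callers decide whether a "soft" failure is good enough, instead of losing the work done so far.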