NAGnews 177
Solving QCQP problems – new in the NAG Library

Quadratic functions are a powerful modelling construct in mathematical programming and appear in many disciplines, such as statistics, machine learning (Lasso regression), finance (portfolio optimization), engineering (optimal power flow) and control theory. At Mark 27.1 of the NAG Library, NAG introduced two new additions to the NAG Optimization Modelling Suite that let users easily define quadratic objective functions and/or constraints, integrate them seamlessly with other constraints, and solve the resulting problems with compatible solvers, without the need for reformulation or any extra effort.
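For readers less familiar with the problem class, a quadratically constrained quadratic program (QCQP) can be written in the following generic form (the notation is illustrative and is not NAG's exact documentation convention):

\[
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & \tfrac{1}{2}\, x^T P_0\, x + q_0^T x \\
\text{subject to} \quad & \tfrac{1}{2}\, x^T P_i\, x + q_i^T x + r_i \le 0, \qquad i = 1, \dots, m, \\
& l_x \le x \le u_x,
\end{aligned}
\]

where each P_i is a symmetric matrix. When all the P_i are positive semidefinite the problem is convex, and the new modelling routines let a compatible solver in the suite handle it directly, with no hand reformulation required.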

How to access the new NAG Library functionality

As with all new releases, we encourage NAG Library users to upgrade to the latest Mark to access the new content and performance improvements. NAG Library downloads are available from the NAG website. The new Mark 27.1 functionality is also available in the NAG Library for Python.

If you don’t have access to the NAG Library and you’d like to try the new functionality, we offer full product trials. If you have any questions or need help, do get in touch with our Technical Support team.

Webinar: A practical perspective on Quantum Computing – 22 April 2021

Join us as we break down the Quantum Computing (QC) hype, discuss unique properties inherent to quantum systems, and survey fundamental QC algorithms.

Mathematical optimization problem? Which solver should I use?

Matching the right optimization solver to the problem at hand is crucial in mathematical optimization. Learn more about the process (and what can go wrong) in the latest Optimization Corner blog series by NAG Software Engineer Andrew Sajo.

Best Practice Scalable Machine Learning Collaboration Centre Progress

How to quickly train an AI model at scale using Azure Machine Learning

In February 2021 NAG announced its participation in the Azure HPC & AI Collaboration Centre. Through the new Centre, NAG provides machine learning expertise to Microsoft Azure users and helps develop best practice for the deployment of scalable machine learning in the cloud.

The first Centre output is a tutorial by NAG’s Phil Tooley that demonstrates how to quickly train an AI model at scale using Azure Machine Learning.

Catch the recording

Calibrate models faster and improve performance 20%+ with DFO!

If you missed our March 2021 webinar, you can catch the recording of ‘Calibrate models faster and improve performance 20%+ with DFO’.

Webinar summary

Calibrating the parameters of complex models can be difficult, particularly when the model is both expensive to evaluate and noisy; in such cases the model derivatives are not readily available. Join NAG colleagues Jan Fiala and Benjamin Marteau as they present Derivative-free Optimization (DFO), a novel technique for tackling optimization without derivatives, demonstrated on the Heston stochastic volatility model with term structure.
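To give a flavour of the setting, calibration is commonly posed as a least-squares fit of model outputs to observed market data (the notation below is illustrative, not taken from the webinar):

\[
\min_{\theta} \; \sum_{i=1}^{m} \bigl( f(\theta; t_i) - y_i \bigr)^2,
\]

where f(\theta; t_i) is the expensive, noisy model output at data point t_i (for example a Heston option price), y_i is the corresponding market observation, and the parameters \theta must be recovered from function values alone, with no gradients available; this is precisely the regime that DFO methods target.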