The latest Mark of the NAG® Library is available to download today. Mark 27.3 features a symbolic Nearest Correlation Matrix (NCM) solver, FEAST Eigensolver, and more second-order automatic differentiation (AD) solvers.
Mark 27.3 introduces a new solver for computing the symbolic adjoint of the nearest correlation matrix. Computing derivatives of the NCM allows sensitivities to the input data to be found. Previously this could be done in the NAG AD Library by computing the algorithmic adjoint, which differentiates the code line by line. The symbolic adjoint instead computes the derivative mathematically, resulting in a routine that is 70 times faster and uses 2500 times less memory than the algorithmic adjoint. The symbolic adjoint is accessed via a new mode for the solver g02aa.
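To make the idea concrete, here is a minimal, illustrative sketch in Python (not NAG's g02aa, and not either adjoint): the nearest correlation matrix is computed by Higham's alternating-projections method, and a sensitivity to one input entry is approximated by finite differences. The function names and tolerances are our own choices for this example; both NAG adjoints compute such sensitivities analytically and far more efficiently than the finite-difference check shown here.

```python
import numpy as np

def nearest_correlation(G, tol=1e-8, max_iter=200):
    """Nearest correlation matrix via Higham's alternating projections
    with Dykstra's correction (illustrative sketch, not NAG's g02aa)."""
    Y = G.copy()
    dS = np.zeros_like(G)
    for _ in range(max_iter):
        R = Y - dS
        # Project onto the positive semidefinite cone.
        w, V = np.linalg.eigh(R)
        X = (V * np.maximum(w, 0.0)) @ V.T
        dS = X - R
        Y_new = X.copy()
        np.fill_diagonal(Y_new, 1.0)  # project onto the unit-diagonal set
        if np.linalg.norm(Y_new - Y, 'fro') <= tol * np.linalg.norm(Y, 'fro'):
            return Y_new
        Y = Y_new
    return Y

def sensitivity(G, i, j, h=1e-6):
    """Central finite-difference estimate of d f(NCM(G)) / d G[i, j],
    where f sums the entries of the NCM -- a crude stand-in for the
    analytic derivatives an adjoint solver provides."""
    E = np.zeros_like(G)
    E[i, j] = E[j, i] = h
    f = lambda A: nearest_correlation(A).sum()
    return (f(G + E) - f(G - E)) / (2.0 * h)

# An indefinite input matrix with a unit diagonal after scaling.
G = np.array([[2.0, -1.0,  0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0,  2.0]])
X = nearest_correlation(G)
s = sensitivity(G, 0, 1)
```

The point of the symbolic adjoint is that it delivers these derivatives without either re-running the solver per input entry (as finite differences do) or taping every arithmetic operation (as the algorithmic adjoint does).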
How to access the new NAG® Library functionality
As with all new releases, we encourage NAG® Library users to upgrade to the latest Mark to access the new content and performance improvements. NAG® Library downloads are available here. If you don’t have access to the NAG® Library and you’d like to try the new functionality, we offer full product trials. If you have any questions or need help, do get in touch with our Technical Support team.
For Faster, Cheaper, Better, Cloud Applications
Learn why cost-to-solution is the best way to achieve performance and flexibility for HPC on Cloud. Join NAG’s team of experts as they discuss how cloud computing can drive varying hardware choices, the secrets to achieving lower cost computing, and how to turn your applications up to eleven, delivering results as quickly as possible when time is of the essence.
NAG dco/c++ is an automatic differentiation tool for computing sensitivities of C++ codes. The tool is easy to use, can be applied quickly to a code base, and integrates with build and testing frameworks. The latest release (dco/c++ v3.5.0) features:
- Improved resizing strategy for the adjoint vector (with a speedup of a couple of percent)
- Unified source version for Windows and Linux
- CMake build system replacing make/nmake for building the examples and case studies
- Various bug fixes
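For readers new to what a tool like dco/c++ automates, the core idea of AD can be shown with a toy forward-mode "dual number" class in Python. This is purely illustrative, with names of our own invention; it is not dco/c++'s API, which is a C++ operator-overloading tool that also provides the far more powerful adjoint (reverse) mode with tape management.

```python
import math

class Dual:
    """A value carrying its derivative; overloaded arithmetic
    propagates derivatives by the chain rule (toy forward-mode AD)."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def sin(x):
    # Chain rule: (sin u)' = cos(u) * u'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# Differentiate f(x) = x*sin(x) + 3x at x = 2 by seeding dx/dx = 1.
x = Dual(2.0, 1.0)
y = x * sin(x) + 3 * x
# y.dot now holds f'(2) = sin(2) + 2*cos(2) + 3
```

dco/c++ applies this same overloading principle to whole C++ code bases, which is why it can be introduced with little change to existing source and wired into build and testing frameworks.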
Want to learn more about Automatic Differentiation?
NAG is delighted to present an online Automatic Differentiation (AD) Masterclass Series. Through the series, NAG shares best AD practice, software-engineering issues, optimization techniques, and common pitfalls learned from over a decade on the front lines applying AD to real-world codes. The series will deepen your AD knowledge and show you how to move beyond toy examples and proofs of concept – take the NAG AD Masterclass.
- The Trading Show Chicago 2021
- Webinar: What’s New With the Nearest Correlation Matrix
- Training: Introduction to Modern Fortran
- Webinar: Solving the HPC Cost to Solution Puzzle
- Training: Introduction to MPI
- Training: Introduction to OpenMP
- Training: Introduction to CUDA