Learn Real-world Algorithmic Differentiation from AD Pioneers

NAG is delighted to present the online Algorithmic Differentiation (AD) Masterclass Series. NAG is a pioneer in the industrial application of AD. The aim of the AD Masterclass Series is to share the best practices, software engineering insights, optimization techniques and common pitfalls we’ve learned from over a decade on the front lines applying AD to real-world codes. The series will deepen your AD knowledge and show you how to move beyond toy examples and proofs of concept.

AD Masterclass attendees will learn:
  • What AD is, its impact on the world and what it means for your codes
  • How to compute derivatives using tangent and adjoint AD
  • How to set up a rigorous testing harness, and the software engineering implications of this
  • How to exploit SIMD vectorization
  • How to bootstrap a validated adjoint on a real-world code
  • How to speed up your adjoint solution and reduce its memory use

The Masterclass will focus on NAG’s AD tool dco/c++, but the topics are essentially universal. 

Attendees will receive access to all code examples used, together with dco/c++ trial licences.

The online AD Masterclass Series dates and details:

Please register for the AD Masterclass Series. Registering ensures that if you can't make a class for any reason, you'll receive a recording by email.

Why the need for Algorithmic Differentiation? | 30 July 2020
AD is changing our world more rapidly than ever. Why is this? We introduce the two basic models of AD, tangent (forward) mode and adjoint (reverse) mode, and demonstrate how they are used to compute exact derivatives. We compare speed and accuracy with finite differences (bumping) and show some real-world examples of how AD has been used in science and engineering.
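
To make the tangent model concrete, here is a minimal sketch in plain C++ (illustrative only, not dco/c++ syntax): a hand-rolled dual number propagates a value and its tangent together, and we compare the result against a one-sided finite difference.

    #include <cmath>
    #include <cstdio>

    // Minimal dual number: value and tangent (derivative) propagated together.
    struct Dual {
        double v;  // value
        double t;  // tangent: d(value)/dx
    };

    Dual operator*(Dual a, Dual b) { return {a.v * b.v, a.t * b.v + a.v * b.t}; }
    Dual sin(Dual a) { return {std::sin(a.v), std::cos(a.v) * a.t}; }

    // f(x) = x * sin(x); exact derivative: sin(x) + x*cos(x)
    template <typename T> T f(T x) {
        using std::sin;   // enables both std::sin(double) and our sin(Dual)
        return x * sin(x);
    }

    int main() {
        double x = 1.5;
        Dual xd{x, 1.0};                      // seed the input tangent with 1.0
        Dual yd = f(xd);                      // tangent AD: exact to machine precision
        double h = 1e-6;                      // bumping: accuracy limited by the choice of h
        double fd = (f(x + h) - f(x)) / h;
        std::printf("AD: %.12f  FD: %.12f  exact: %.12f\n",
                    yd.t, fd, std::sin(x) + x * std::cos(x));
    }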

How AD works: computing Jacobians | 6 August 2020
We dig into the fundamentals of AD, showing how it works and what the implications are, including for third-party libraries. We examine first-order tangents and adjoints in some detail and look at performance and memory use. Adjoint memory use, if untamed, is prohibitive; fortunately, there are ways to control it comprehensively.
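
As an illustration of the adjoint model, the following toy tape (a deliberately simplified stand-in for what a tool such as dco/c++ manages for you) records local partial derivatives during the forward pass and recovers all input sensitivities in a single reverse sweep; note how the tape, and hence memory, grows with every recorded operation.

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // A toy tape: each entry stores the indices of its inputs and the local
    // partial derivatives recorded during the forward (primal) pass.
    struct Entry { int a, b; double da, db; };
    std::vector<Entry> tape;

    struct Var { int idx; double v; };

    Var make(double v) { tape.push_back({-1, -1, 0, 0}); return {int(tape.size()) - 1, v}; }

    Var operator*(Var x, Var y) {
        tape.push_back({x.idx, y.idx, y.v, x.v});   // d(xy)/dx = y, d(xy)/dy = x
        return {int(tape.size()) - 1, x.v * y.v};
    }
    Var sin(Var x) {
        tape.push_back({x.idx, -1, std::cos(x.v), 0});
        return {int(tape.size()) - 1, std::sin(x.v)};
    }

    // Reverse sweep: one pass over the tape gives d(output)/d(every input).
    std::vector<double> adjoints(int out) {
        std::vector<double> adj(tape.size(), 0.0);
        adj[out] = 1.0;                              // seed the output adjoint
        for (int i = out; i >= 0; --i) {
            if (tape[i].a >= 0) adj[tape[i].a] += tape[i].da * adj[i];
            if (tape[i].b >= 0) adj[tape[i].b] += tape[i].db * adj[i];
        }
        return adj;
    }

    int main() {
        Var x = make(1.5), y = make(0.5);
        Var z = x * sin(y) * y;                      // z = x * sin(y) * y
        std::vector<double> adj = adjoints(z.idx);
        // Tape length (and hence memory) grows with every recorded operation.
        std::printf("dz/dx=%.6f dz/dy=%.6f tape entries=%zu\n",
                    adj[x.idx], adj[y.idx], tape.size());
    }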

Testing and validation | 13 August 2020
Why should we test AD codes? We highlight common problems and pitfalls, including non-differentiability. We show how to incorporate AD testing into your test harness. Doing this has far-reaching software engineering implications for code maintenance and build systems. We highlight some of the options available and share best practice. 
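
As a flavour of what such a test might look like, here is a small sketch (the name check_derivative and its tolerances are our own illustrative choices, not part of any particular harness): an AD-computed derivative is compared against a central finite difference with a relative tolerance.

    #include <cmath>
    #include <cstdio>
    #include <functional>

    // Illustrative check: compare an AD-computed derivative against a central
    // finite difference at one point, with a relative tolerance that allows
    // for the O(h^2) truncation error of the central difference.
    bool check_derivative(const std::function<double(double)>& f,
                          double ad_deriv, double x,
                          double h = 1e-5, double rtol = 1e-6) {
        double fd = (f(x + h) - f(x - h)) / (2.0 * h);
        double scale = std::fmax(std::fabs(ad_deriv), 1.0);
        return std::fabs(ad_deriv - fd) <= rtol * scale;
    }

    int main() {
        auto f = [](double x) { return x * std::sin(x); };
        double x = 1.5;
        double ad = std::sin(x) + x * std::cos(x);   // stand-in for the AD result
        std::printf("derivative test %s\n",
                    check_derivative(f, ad, x) ? "PASSED" : "FAILED");
    }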

Pushing performance using SIMD vectorization | 20 August 2020
Why bother with tangent mode? There are many reasons! Having a fast tangent is good news for everyone, and we’ll explain why. We show how running tangent and adjoint codes in vector mode together with SIMD vectorization can give a healthy speedup.
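
To illustrate the idea, here is a minimal sketch of vector-mode tangents in plain C++ (not dco/c++ syntax): each value carries several tangent components, so a single primal pass yields several directional derivatives, and the per-component loops are exactly the kind of straight-line, stride-1 code that compilers can auto-vectorize with SIMD instructions.

    #include <cstdio>

    constexpr int W = 4;  // number of tangent directions carried per value

    // Vector-mode dual: one value, W tangents. The tangent loops below are
    // natural candidates for compiler auto-vectorization (SIMD).
    struct VDual {
        double v;
        double t[W];
    };

    VDual operator*(const VDual& a, const VDual& b) {
        VDual r;
        r.v = a.v * b.v;
        for (int i = 0; i < W; ++i) r.t[i] = a.t[i] * b.v + a.v * b.t[i];
        return r;
    }

    int main() {
        // Two inputs; seeding the identity in the tangent slots means one
        // primal pass yields both columns of the Jacobian of y = x0 * x1 * x0.
        VDual x0{1.5, {1, 0, 0, 0}};
        VDual x1{0.5, {0, 1, 0, 0}};
        VDual y = x0 * x1 * x0;
        std::printf("dy/dx0=%.6f dy/dx1=%.6f\n", y.t[0], y.t[1]);
    }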

Bootstrapping validated adjoints on real-world codes | 27 August 2020
The essential first step before doing any code optimization is having a correctness test in place. The same holds for adjoints: before applying any adjoint optimization techniques, we need a test harness that validates the inner product identity. However, this requires a working adjoint, and we know a brute-force approach is likely to exhaust memory. How can this be resolved? We discuss several options for bootstrapping a validated adjoint, including different tape types and using Jacobian pre-accumulation. The latter technique can be very effective. Once we have a validated adjoint, the stage is set for applying more advanced adjoint optimizations such as checkpointing, symbolic adjoints and external adjoints. We will discuss these more advanced topics in upcoming webinars.
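
For the curious, here is a self-contained sketch of an inner product identity check (the tangent and adjoint routines are hand-coded stand-ins for what an AD tool would generate): for a tangent yt = J*xt and an adjoint xa = J^T*ya, the identity <ya, yt> = <xa, xt> must hold to round-off for any choice of seeds.

    #include <cstdio>
    #include <cstdlib>

    // f : R^2 -> R^2, f(x) = (x0*x1, x0 + x1). In practice the tangent and
    // adjoint below would come from an AD tool; here they are hand-coded.

    // Tangent: yt = J(x) * xt
    void f_tangent(const double x[2], const double xt[2], double yt[2]) {
        yt[0] = x[1] * xt[0] + x[0] * xt[1];
        yt[1] = xt[0] + xt[1];
    }

    // Adjoint: xa = J(x)^T * ya
    void f_adjoint(const double x[2], const double ya[2], double xa[2]) {
        xa[0] = x[1] * ya[0] + ya[1];
        xa[1] = x[0] * ya[0] + ya[1];
    }

    int main() {
        double x[2] = {1.5, 0.5};
        // Random tangent and adjoint seeds.
        double xt[2], ya[2];
        for (int i = 0; i < 2; ++i) {
            xt[i] = std::rand() / double(RAND_MAX);
            ya[i] = std::rand() / double(RAND_MAX);
        }
        double yt[2], xa[2];
        f_tangent(x, xt, yt);
        f_adjoint(x, ya, xa);
        // Inner product identity: <ya, J xt> == <J^T ya, xt>
        double lhs = ya[0] * yt[0] + ya[1] * yt[1];
        double rhs = xa[0] * xt[0] + xa[1] * xt[1];
        std::printf("lhs=%.12f rhs=%.12f (should agree to round-off)\n", lhs, rhs);
    }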