NAGnews 124 | 7 August 2014

In this issue

  • NAG Toolbox for MATLAB® updated to Mark 24
  • New article: Your algorithms may be misleading you - use Algorithmic Differentiation to find out
  • New case study: a General Equilibrium Model with Endogenous Stock Market Non-participation
  • Mark 24 New Functionality Spotlight: Confluent and Gauss Hypergeometric Functions
  • Training Courses & Events
  • The Best of the Blog


NAG Toolbox for MATLAB® updated to Mark 24 – new functionality available to users


The NAG Toolbox for MATLAB® has been updated to reflect the new mathematical and statistical routines included in Mark 24 of the NAG Library. The NAG Toolbox is a comprehensive set of interfaces to the NAG Library, which has a global reputation for excellence and, with over 1,500 routines, is the largest commercially available collection of mathematical and statistical algorithms. Its capabilities complement those of MATLAB itself, providing functionality that is otherwise unavailable or only available by buying several specialist toolboxes.

The key new functionality at Mark 24 includes:

  • Confluent Hypergeometric Function
  • Two-stage Spline Approximation to Scattered Data
  • Further additions to Nearest Correlation Matrix (see the sketch after this list)
  • Multi-start Optimization
  • Optimization for Non-negative Least Squares
  • Matrix Functions
  • Inhomogeneous Time Series
  • Gaussian Mixture Model
  • Best Subset
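
To give a flavour of the Nearest Correlation Matrix additions, here is a minimal sketch of the alternating-projections method (Higham, 2002) that this class of routine builds on. It is an illustration in Python under our own assumptions about names and iteration counts, not the NAG implementation (which uses a much faster Newton algorithm):

    import numpy as np

    def nearest_correlation(A, n_iter=100):
        """Approximate the correlation matrix nearest to symmetric A in the
        Frobenius norm, via Dykstra-corrected alternating projections."""
        Y = A.copy()
        dS = np.zeros_like(A)                     # Dykstra correction term
        for _ in range(n_iter):
            R = Y - dS
            w, V = np.linalg.eigh(R)              # project onto the PSD cone
            X = (V * np.clip(w, 0, None)) @ V.T   # by zeroing negative eigenvalues
            dS = X - R
            Y = X.copy()
            np.fill_diagonal(Y, 1.0)              # project onto unit-diagonal matrices
        return Y

    # Repair an inconsistent "correlation" matrix (this one is indefinite):
    A = np.array([[ 1.0, 0.9, -0.6],
                  [ 0.9, 1.0,  0.9],
                  [-0.6, 0.9,  1.0]])
    print(nearest_correlation(A))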

We strongly encourage users of the NAG Toolbox for MATLAB to upgrade their existing software to Mark 24 - you can do this here. Many NAG Library users are entitled to use the NAG Toolbox because it is included in their software licence. If you think this could be you, contact us and we will check your licence. 30-day trials of the product are also available.


Can you trust your algorithms? New article: Your algorithms may be misleading you - use Algorithmic Differentiation to find out


Algorithms are critical to how we interact with data, and as the volume and variety of data increase, so does our reliance on algorithms to give us the answers we seek. But how much faith should you put in those algorithms, and how can you be sure they're not misleading you? These are not simple questions, but through the use of algorithmic differentiation techniques, data scientists can get more precise answers.

Algorithmic differentiation (AD), sometimes called automatic differentiation, is a technique used to ascertain the precision of an algorithm's results on a given data set, and to determine the algorithm's susceptibility to data volatility. The concepts behind AD originated decades ago in the geology and meteorology fields, where they were used to boost the effectiveness of HPC codes that predict the weather or tell energy companies where to drill for oil.
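
To make the idea concrete, here is a minimal sketch of forward-mode AD using dual numbers, in Python. The class and function names are our own for illustration; production AD tools (such as NAG's dco) apply the same idea to whole codes via operator overloading or source transformation:

    import math

    class Dual:
        """A number a + b*eps with eps**2 == 0; the eps coefficient
        carries the derivative through every arithmetic operation."""
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot

        def __add__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val + other.val, self.dot + other.dot)
        __radd__ = __add__

        def __mul__(self, other):
            other = other if isinstance(other, Dual) else Dual(other)
            return Dual(self.val * other.val,
                        self.val * other.dot + self.dot * other.val)  # product rule
        __rmul__ = __mul__

    def exp(x):
        return Dual(math.exp(x.val), math.exp(x.val) * x.dot)  # chain rule

    # Differentiate f(x) = x * exp(x) at x = 1 by seeding dx/dx = 1:
    x = Dual(1.0, 1.0)
    f = x * exp(x)
    print(f.val, f.dot)   # f(1) = e and f'(1) = 2e

Unlike finite differences, the derivative comes out exact to machine precision, which is what makes AD so useful for probing an algorithm's sensitivity to its inputs.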

AD has proved its value in a variety of cases where the accuracy of data is critical to the achievement of goals. If the results of an AD test show that a data model breaks down when presented with real-life data inputs, then the owner may scrap the model and start over. Conversely, if AD shows that a model works even with dirtier data, then the owner may be able to save money by dialling down the precision of data collection, such as with a sensor on a weather satellite.

Today, AD is used in a variety of industries where HPC is prevalent, including aerospace and auto manufacturing, where it is used to optimize the algorithms that determine the shapes of wings and car bodies, and in finance, where it is used to fine-tune the algorithms behind option pricing models, for example.

But as the big data analytics phenomenon drives forward and smaller outfits start experimenting with data mining, AD proponents are concerned that some of the hard-won lessons of AD are not trickling down as quickly as they might.

Read the full article published in Datanami here.


New Case Study: Using the NAG Library to solve a General Equilibrium Model with Endogenous Stock Market Non-participation


A significant fraction of the population in any country with a liquid stock market does not hold stocks, despite the international evidence of a historically high equity risk premium. Whereas the phenomenon of Limited Stock Market Participation has been used to explain several other asset pricing puzzles, only a few studies address the equilibrium foundation for non-participation. Yet understanding the sources of non-participation is of great importance for portfolio managers, regulators and academics. We develop a consumption-based asset pricing model, in the spirit of Lucas (1978), with incomplete financial markets. The model incorporates heterogeneous endowments, heterogeneous labour income processes, and heterogeneous preferences with external additive habit formation, also known as "Catching up with the Joneses": investors maximize lifetime utility over surplus consumption. One investor serves as the representative stand-in for all stockholders, whereas the second, would-be investor represents potential non-participants.
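
The case study sets out the model in full; for orientation, the external additive habit ("Catching up with the Joneses") preference it refers to is conventionally written as below. This is the textbook form, and the case study's exact specification may differ:

    U_i = \mathbb{E}_0 \sum_{t=0}^{\infty} \beta^t \,
          \frac{(C_{i,t} - X_t)^{1-\gamma}}{1-\gamma}

where C_{i,t} - X_t is investor i's surplus consumption, X_t is the external habit level driven by past aggregate consumption, \beta is the subjective discount factor and \gamma governs risk aversion.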

Read the full case study here.


Mark 24 New Functionality Spotlight: Confluent and Gauss Hypergeometric Functions (1F1 and 2F1)


Included at Mark 24 of the NAG Library are routines for the evaluation of the confluent hypergeometric function and the Gauss hypergeometric function, with real-valued parameters and a real argument. A mini-article has been written to explain the use of these new routines in more detail. You can read it here.
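
For readers who want to experiment with these functions before obtaining Mark 24, the sketch below evaluates the same two functions via SciPy rather than the NAG routines themselves; the NAG equivalents are documented in the Library's Chapter S:

    import math
    from scipy.special import hyp1f1, hyp2f1

    # Confluent hypergeometric function 1F1(a; b; x)
    print(hyp1f1(1.5, 2.5, 0.5))

    # Gauss hypergeometric function 2F1(a, b; c; x); check against the
    # closed form 2F1(1, 1; 2; x) = -ln(1 - x) / x
    x = 0.5
    print(hyp2f1(1.0, 1.0, 2.0, x), -math.log(1 - x) / x)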


Training Courses & Events


Course, Date & Location

  • Fortran 95
    19 August 2014, Manchester, UK
  • Essential HPC for Buyers, Managers, and R&D Leaders
    8 September 2014, Chicago, USA
  • Essential HPC for Buyers, Managers, and R&D Leaders
    16 September 2014, London, UK
  • Multicore Programming with OpenMP
    14 October 2014, London, UK
  • An Introduction to CUDA Programming
    21 October 2014, London, UK
  • Multicore Programming with OpenMP
    5 November 2014, Houston, USA
  • An Introduction to OpenCL Programming
    10 November 2014, Chicago, USA

NAG will also be at a number of exhibitions and conferences over the next few months.


The Best of the Blog


Secrets of HPC Procurement

Did you like my article today in HPCwire, "Secrets of the Supercomputers"? In it I poke fun at various elements of an imaginary supercomputer procurement process; however, I'm sure many readers will also see familiar and painfully serious aspects in the fictional story.

As I mention at the bottom of that article, NAG can help make the process of buying HPC systems much better than the worrying case in the article.

For example, the tutorial I ran at SC13 with Terry Hewitt titled "Effective Procurement of HPC Systems" was very popular (~100 attendees). We have provided similar training as part of consulting engagements and we are now looking at running the tutorial again as an open short course.

NAG can also provide other training courses (we did a press release on our upcoming schedule today) including "Essential HPC for Buyers, Managers, and R&D Leaders".

Read the full blog post here.


NAGnews - Past Issues


We provide an online archive of past issues of NAGnews. For editions prior to 2010, please contact us.
