NAGnews 114 | 30 May 2013
In this issue
- NAG announces the latest version of the NAG Library with extensive new functionality
- New functionality spotlight: New Global Optimization
- White Paper - Local Volatility FX Basket Option on CPU and GPU
- NAG's HPC Team improve performance and capability of Turbulence Flow Solver
- Latest NAG Prize Winners - University of Manchester
- Events and Training Courses
- The best of the blog
New NAG Fortran Library Updated to Mark 24 - now available
We are delighted to announce availability of the latest Mark of the NAG Fortran Library. The Fortran Library now contains over 1,750 mathematical and statistical algorithms with over 130 new routines being added at this update. Exciting new functionality includes:
- Multi-start (global) Optimization
- Non-negative Least Squares
- Nearest Correlation Matrix
- Inhomogeneous Time Series
- Gaussian Mixture Model
- Confluent Hypergeometric Function (1F1)
- Brownian Bridge
- Best Subsets
- Real Sparse Eigenproblems
- Matrix Functions
- Two Stage Spline Approximation
In this and future issues of NAGnews we will shine a spotlight on the various new routines to give you insight into the purpose and background of a new routine's inclusion in the Library. Today we focus on the new Multi-start (global) Optimization routine.
Read the mini article on Multi-start (global) Optimization here, or below.
If you're an existing, supported, NAG Fortran Library user we urge you to upgrade to the latest Mark. Using the latest versions of our Libraries means you'll have access to all the exciting new functionality. We are happy to guide users through the upgrade if necessary. Contact us for more information.
New Multi-start (global) Optimization in the NAG Library
Whereas routines in the NAG Optimization Chapter (E04 - Minimizing or Maximizing a Function) seek to find a local optimum, the Global Optimization Chapter (E05 - Global Optimization of a Function) contains routines which attempt to find the best of these local optima, the global minimum. Such problems are in general very difficult, especially if, as is often the case, the optimization is accompanied by nonlinear constraints on the solution.
Algorithms have been proposed to address such problems, many of which cater for functions which are not known to a high degree of accuracy or which are non-smooth. Such algorithms typically take very many function evaluations and hence are liable to take a long time to run.
When the functions are smooth a very practical means of addressing the global optimization problem is to use a local optimization routine but start it from many different points. If the best of the resulting solutions is returned, then we might be willing to accept this as a global solution. Of course there is no guarantee that a global minimum has been found, but increased confidence might be gained by using more starting points or by running the program again with different starting points.
Frequently this technique proves faster than, and as reliable as, other methods. We offer two multi-start routines in the NAG Library. One routine uses a sequential QP algorithm for finding the local minimum of a general nonlinear function subject to linear, nonlinear and simple bound constraints. The user has the option to return not just the best solution, but any specified number of solutions, ordered by the value of the objective at each minimum. Such a facility might be useful if there are other criteria, not specified in the mathematical formulation, that the user would like in an acceptable solution. An example might be a solution that is 'good' but also stable, in the sense that small changes in the variables do not cause a large change in the objective: small errors in manufacturing a part, say, may still give acceptable performance of the assembled product.
The NAG Library also contains a multi-start algorithm for nonlinearly constrained, nonlinear sum of squares problems. This too has the option to return the best few solutions. Nonlinearly constrained sum of squares problems typically occur when fitting models to observed data.
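The multi-start idea described above is straightforward to sketch. The following Python toy is illustrative only, not NAG code: a crude gradient-descent local solver (standing in for an E04 routine) is launched from many random starting points, and the solutions are returned ordered by objective value, mirroring the "best few solutions" facility.

```python
import random

def f(x):
    # A multimodal objective: two local minima, the global one near x = -1.
    return (x * x - 1.0) ** 2 + 0.3 * x

def local_minimize(f, x, step=1e-3, iters=20000, h=1e-6):
    # Crude gradient descent with a numerical derivative; in practice a
    # NAG local optimization routine would play this role.
    for _ in range(iters):
        g = (f(x + h) - f(x - h)) / (2 * h)
        x -= step * g
    return x

def multistart(f, lo, hi, nstarts=20, seed=0):
    # Run the local solver from many random starts and rank the results
    # by objective value; the first entry is our candidate global minimum.
    rng = random.Random(seed)
    sols = [local_minimize(f, rng.uniform(lo, hi)) for _ in range(nstarts)]
    return sorted(sols, key=f)

best = multistart(f, -2.0, 2.0)[0]   # close to the global minimum near x = -1.04
```

With more starting points (or a second run with a different seed) confidence in `best` grows, exactly as the article suggests, though a guarantee is never obtained.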
New White Paper: Local Volatility FX Basket Option on CPU and GPU
NAG's Jacques du Toit and latest NAG Prize Winner Isabel Ehrlich have co-authored a new white paper 'Local Volatility FX Basket Option on CPU and GPU'.
Financial markets have seen an increasing number of new derivatives and options available to trade, many of which don't have closed form pricing formulae. One way to price options without a closed form solution is by Monte Carlo simulation; however, this is often not the preferred method of pricing since it is computationally demanding. The advent of massively parallel computer hardware (multicore CPUs with AVX, GPUs, Intel Xeon Phi) holds great promise for Monte Carlo methods since these methods are ideally suited to the hardware. For many financial products, Monte Carlo methods can now give prices fast enough to be useful to a financial institution's activities.
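To illustrate why Monte Carlo pricing suits parallel hardware so well, here is a minimal Python sketch (not NAG code) that prices a vanilla European call under geometric Brownian motion and compares it with the Black-Scholes closed form. Every path is independent, so the loop could be split across cores or GPU threads with no communication at all.

```python
import math
import random

def bs_call(S, K, r, sigma, T):
    # Black-Scholes closed form, used here only as a reference price.
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def mc_call(S, K, r, sigma, T, n_paths=200_000, seed=42):
    # Monte Carlo under geometric Brownian motion: simulate terminal
    # prices and average the discounted payoff.  Each path is independent,
    # which is why the method maps so well onto massively parallel hardware.
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma**2) * T
    vol = sigma * math.sqrt(T)
    payoff = 0.0
    for _ in range(n_paths):
        ST = S * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        payoff += max(ST - K, 0.0)
    return math.exp(-r * T) * payoff / n_paths
```

For this simple contract the closed form makes Monte Carlo unnecessary; the point is that the same sampling loop still works when, as for the basket option below, no closed form exists.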
We examine a traditionally more demanding product, namely an FX basket option driven by a multi-factor local volatility model. Such options are often considered too complicated to tackle analytically in a market-consistent manner, and are too high dimensional for PDE methods. Consequently they are frequently valued using Monte Carlo methods. This results in a compute intensive, massively parallel problem which is ideally suited to modern CPUs and GPUs. We develop fully parallelized, fully vectorized code on a top end Intel CPU and NVIDIA GPU and study the effects of mixed precision on accuracy and performance. We also investigate using texture memory on the GPU.
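A toy version of the basket payoff, in Python rather than the paper's CPU/GPU code, and with constant volatilities in place of a local volatility surface: two correlated assets are simulated by applying a 2x2 Cholesky factor to independent normals, and the discounted basket payoff is averaged.

```python
import math
import random

def mc_basket_call(S0, w, K, r, sigma, corr, T, n_paths=100_000, seed=1):
    # Two-asset basket call under correlated geometric Brownian motion.
    # Correlation is imposed with a 2x2 Cholesky factor; a real FX basket
    # pricer under local volatility would make sigma state-dependent.
    rng = random.Random(seed)
    c = math.sqrt(1.0 - corr * corr)       # Cholesky factor of [[1, corr], [corr, 1]]
    total = 0.0
    for _ in range(n_paths):
        z1 = rng.gauss(0.0, 1.0)
        z2 = corr * z1 + c * rng.gauss(0.0, 1.0)
        s1 = S0[0] * math.exp((r - 0.5 * sigma[0]**2) * T + sigma[0] * math.sqrt(T) * z1)
        s2 = S0[1] * math.exp((r - 0.5 * sigma[1]**2) * T + sigma[1] * math.sqrt(T) * z2)
        basket = w[0] * s1 + w[1] * s2
        total += max(basket - K, 0.0)
    return math.exp(-r * T) * total / n_paths
```

Even in this stripped-down form the structure of the computation - many independent paths, each a handful of exponentials - shows why the full multi-factor problem vectorises and parallelises so cleanly.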
Read the white paper here.
Latest NAG Prize Winners - University of Manchester
Craig Lucas, Senior Technical Consultant at NAG, attended the recent SIAM Student Chapter Conference at the University of Manchester. The photo below shows Craig presenting student Sophia Bethany Coban with the prize for best student talk at the conference. Sophia presented 'Mathematical Modelling of X-Ray Computed Tomography'. Fellow student Thomas Slater also won a prize for his poster 'Chemically Sensitive 3D Imaging of Nanoparticles'. Congratulations to Sophia and Thomas!
NAG's HPC Team improve performance and capability of Turbulence Flow Solver
An HPC expert from NAG, working under NAG's Computational Science and Engineering (CSE) support service for HECToR, the UK's national academic supercomputing facility, has successfully introduced the overlapping of communication and computation in a Computational Fluid Dynamics code used to conduct state-of-the-art turbulence studies.
Incompact3D is an in-house computational fluid dynamics code used by the Turbulence, Mixing and Flow Control Group at Imperial College and its academic collaborators to conduct state-of-the-art turbulence studies. The work follows on from two previously successful dCSE Incompact3D projects that significantly improved the scalability of the code, enabling it to use more than 10,000 cores on HECToR for production runs. However, as more cores are used, time spent communicating between nodes becomes more prominent, indicating that there is room for further development.
The aim of this project was to reduce the time taken by all-to-all communications, which is particularly important when utilising large core counts on HECToR. This was accomplished by further developing the 2D decomposition and Fast Fourier Transform library, 2DECOMP&FFT, to overlap computation with across-node communication.
Commenting on the dCSE project success, Dr Sylvain Laizet of Imperial College London said: "When the recent update of HECToR increased the number of cores per processor to 32, we observed that the communication within a processor could be a limiting factor for running large simulations with Incompact3D. With dCSE software support, further development of the code reduced the time taken by the all-to-all communications in Incompact3D by about 15%. By using a library solution, 2DECOMP&FFT, there was little impact on end users. Our code Incompact3D is now available as an open source code through a Google project with about 75 new users worldwide since November 2012, opening many possible collaborations within the UK. These three successful dCSE projects related to Incompact3D now enable us to perform highly accurate large scale simulations in conjunction with our experimental results that form a central component of our research effort. As such, the added value is critical."
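The overlap technique itself can be sketched in a few lines. The Python below is a schematic, not Incompact3D code: a dedicated thread stands in for the communication layer, filling a queue with the next block of data while the main thread computes on the block that has already arrived.

```python
import threading
import queue

def fetch(block):
    # Stand-in for an all-to-all transfer: in the real code this would be
    # MPI communication handled by 2DECOMP&FFT.
    return [x * x for x in block]

def compute(data):
    # Stand-in for the per-block computation (e.g. an FFT stage).
    return sum(data)

def overlapped_pipeline(blocks):
    # While block i is being computed, block i+1 is already in flight,
    # so communication time is hidden behind computation.
    results = []
    q = queue.Queue(maxsize=1)

    def communicator():
        for b in blocks:
            q.put(fetch(b))      # "communication" runs on its own thread
        q.put(None)              # sentinel: no more blocks

    threading.Thread(target=communicator).start()
    while (data := q.get()) is not None:
        results.append(compute(data))   # overlaps with the next fetch
    return results
```

The real implementation uses non-blocking MPI calls rather than threads, but the shape of the pipeline is the same: issue the next transfer, compute on the data already in hand, then wait.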
Events & Training Courses
2nd June 2013, University of Ottawa
One of NAG's Senior Technical Consultants, Craig Lucas, will be providing training in the Tutorials part of the event. Craig will present "Introduction to Parallel I/O", which introduces MPI-IO, a technique that allows a program to read and write a single file from multiple processes. It will also cover the parallel I/O libraries NetCDF and HDF5, which are built on top of MPI-IO.
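The core MPI-IO idea - many processes writing disjoint regions of one shared file - can be illustrated without MPI. In this Python sketch (POSIX only), threads stand in for MPI ranks and `os.pwrite` plays the role of `MPI_File_write_at`, each rank writing at its own offset with no locking:

```python
import os
import tempfile
import threading

def write_chunk(fd, rank, chunk):
    # Each writer targets its own byte offset, analogous to
    # MPI_File_write_at in MPI-IO: one shared file, no interleaving.
    os.pwrite(fd, chunk, rank * len(chunk))

def parallel_write(path, chunks):
    # Pre-size the file, then let every "rank" (here a thread) write its
    # slice concurrently.  All chunks are assumed to be the same length.
    with open(path, "wb") as f:
        f.truncate(sum(len(c) for c in chunks))
    fd = os.open(path, os.O_WRONLY)
    threads = [threading.Thread(target=write_chunk, args=(fd, r, c))
               for r, c in enumerate(chunks)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    os.close(fd)
```

Libraries such as NetCDF and HDF5 layer a self-describing data model on top of exactly this kind of offset-addressed collective access.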
- International Supercomputer Conference
16th-20th June 2013, Leipzig
NAG is a regular exhibitor at Europe's main HPC event and this year is no exception. NAG staff will be on hand to talk about our established HPC services and to showcase the new functionality in the NAG Library.
- QuantInvest
24-25 June 2013, Chicago
NAG staff will be present at QuantInvest Chicago and will be hosting The NAG Clinic for those seeking a "Cure for the Common Code".
- Numerical Analysis Conference
25-28 June 2013, University of Strathclyde
NAG staff will be present at this Numerical Analysis event that comprises a number of invited speakers and minisymposia sessions.
- Advanced Risk & Portfolio Management Bootcamp
12th-17th August 2013, New York
NAG is sponsoring this 'bootcamp' course for risk and portfolio management practitioners.
Training Courses Provided by NAG's HECToR Team*
- Accelerating Applications with CUDA 29-31 May 2013, University of Bath
- Parallel Programming with MPI 10-12 June 2013, University College London
These HPC training courses are provided free of charge to HECToR users and UK academics whose work is covered by the remit of one of the participating research councils (EPSRC, NERC and BBSRC). The courses are also open to non-eligible people but will require payment of a course fee. Please see the eligibility page for more details.
Recent blog posts
Keep up to date with NAG's recent blog posts here:
Calling the NAG C Library from kdb+
Kx Systems was founded in 1993 to address a simple problem: the inability of traditional relational database technology to keep up with escalating volumes of data. This problem is even more acute today, where data records, especially in the financial industry, number in the trillions. To give users fast access to their data, the founders of Kx developed kdb+ and the vector-processing programming language 'q'.
Kdb+ has the ability to process data quickly, and for more complex analysis it can call out to other programming languages. To call the NAG C Library from kdb+, a wrapper must be placed around each NAG routine, taking care to convert data types correctly. The wrapper functions must then be compiled into an external library.
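The same wrapping pattern can be shown in miniature with Python's ctypes standing in for kdb+'s k.h API: declare the C routine's signature, convert types on the way in and out. The function wrapped here is the C maths library's `cbrt`, chosen purely for illustration; a kdb+ wrapper around a NAG routine follows exactly this shape in C.

```python
import ctypes
import ctypes.util

# Load the C maths library and declare cbrt's signature, much as a kdb+
# wrapper declares and converts types before invoking a NAG routine.
libm = ctypes.CDLL(ctypes.util.find_library("m") or None)
libm.cbrt.restype = ctypes.c_double
libm.cbrt.argtypes = [ctypes.c_double]

def cbrt(x: float) -> float:
    # The wrapper's whole job: convert the host language's types to C
    # types, call the routine, and convert the result back.
    return libm.cbrt(ctypes.c_double(x))
```

Getting the type declarations right is the crucial step; with a mismatched signature the call may silently corrupt data rather than fail.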
Read the full post by Brian Spector.
Evaluation of non-linear expressions using Excel
NAG provides VB headers for all the routines in the library allowing them to be called easily from Microsoft Excel given a little knowledge of Visual Basic. Along with the headers, some Excel examples are distributed with the product. You may also find demo worksheets for a variety of routines on the NAG web site.
Combining the ease of use of Excel and the power of NAG routines is a great way of creating simple, flexible interfaces for solving difficult problems. Typically you write VBA code to handle the reading and writing of data to the Excel worksheet and the passing of parameters to the NAG routines. Once this VBA code has been written the problem parameters can be easily changed and tweaked from the worksheet without having to alter any code again.
Routines which take only simple parameters such as scalar values, vectors and matrices can have all their arguments input via the Excel worksheet. However, for routines that require a non-linear expression, such as those with callback functions, it isn't so straightforward. One method is simply to write the callback in VBA, but then you lose the interactivity gained from using Excel: every time you want to change the problem you have to edit the VBA code. This isn't a big deal, but there is another way!
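That alternative - keeping the non-linear expression on the worksheet and evaluating it at run time - can be sketched in Python (in the VBA setting one might evaluate a cell formula instead): an expression string becomes the callback passed to a minimizer, so the problem can change without touching any code. The golden-section search here is only a stand-in for a NAG routine.

```python
import math

def make_objective(expr):
    # Build a callback from a worksheet-style expression string, so the
    # problem can be changed without editing any code.  Only 'x' and a
    # few math functions are exposed to the evaluation.
    allowed = {"x": 0.0,
               **{k: getattr(math, k) for k in ("sin", "cos", "exp", "log", "sqrt", "pi")}}
    code = compile(expr, "<expr>", "eval")
    def f(x):
        allowed["x"] = x
        return eval(code, {"__builtins__": {}}, allowed)
    return f

def golden_minimize(f, lo, hi, tol=1e-8):
    # Simple golden-section search for a unimodal function, standing in
    # for a library minimizer that takes f as a callback.
    g = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    while b - a > tol:
        c, d = b - g * (b - a), a + g * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return 0.5 * (a + b)

f = make_objective("(x - 2.0)**2 + 1.0")   # "worksheet" expression
xmin = golden_minimize(f, -10.0, 10.0)
```

Changing the problem now means editing only the expression string, which is precisely the interactivity the worksheet approach preserves.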
Read the full post by Chris Seymour.
NAGnews - Past Issues